The 16 Best Data Protection Software Companies for 2025

Solutions Review’s listing of the best Data Protection Software offerings is an annual mashup of products that best represent current market conditions, according to the crowd. Our editors selected the best Data Protection Software based on each platform’s Authority Score, a meta-analysis of real user sentiment through the web’s most trusted business software review sites, and our own proprietary five-point inclusion criteria.

Data protection is a broad field, encompassing backup and disaster recovery, data storage, business continuity, cybersecurity, endpoint management, data privacy, and data loss prevention. Data protection software becomes more essential as the amount of data an enterprise creates and stores continues to grow at ever-increasing rates. The primary goals of a comprehensive data protection strategy are to ensure data privacy and to enable organizations to quickly restore their data after experiencing a disaster.

Data protection strategies are developing around two concepts: data availability and data management. Data availability ensures that users have access to the data they need to maintain day-to-day business operations at all times, even in the event that data is lost or damaged. On the data management side, the two areas most crucial to data protection software are data lifecycle management and information lifecycle management. These capabilities automate the movement of critical data between online and offline storage and support comprehensive strategies for valuing, cataloging, and protecting data against application errors, user errors, malware, virus attacks, outages, machine failure, and other disruptions.
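To make the lifecycle-management concept above concrete, here is a minimal sketch of an age-based tiering policy, assuming a hypothetical two-tier directory layout; the paths and idle threshold are illustrative assumptions, not any vendor's product behavior.

```python
import shutil
import time
from pathlib import Path

# Hypothetical tiering policy: paths and threshold are illustrative assumptions.
PRIMARY = Path("/data/primary")      # "online" tier
ARCHIVE = Path("/data/archive")      # "offline"/cold tier
MAX_IDLE_DAYS = 90                   # move files not accessed in 90 days

def tier_cold_files() -> int:
    """Move files whose last access time exceeds the idle threshold."""
    cutoff = time.time() - MAX_IDLE_DAYS * 86400
    moved = 0
    for path in PRIMARY.rglob("*"):
        if path.is_file() and path.stat().st_atime < cutoff:
            target = ARCHIVE / path.relative_to(PRIMARY)
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.move(str(path), str(target))
            moved += 1
    return moved

if __name__ == "__main__":
    print(f"Archived {tier_cold_files()} cold files")
```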

Note: The best data protection software companies are listed in alphabetical order.

The Best Data Protection Software

Acronis offers backup, disaster recovery, and secure file sync and share solutions. The company also provides data protection in any environment, including virtual, physical, cloud, and mobile. Acronis True Image 2020 is personal backup software that enables users to duplicate their system, effectively capturing all of their data for system recovery or disk migration. Acronis Cyber Backup is aimed towards businesses of all sizes and offers proactive ransomware protection. Recently, Acronis acquired 5nine Software, CyberLynx, and DeviceLock. The vendor also released Acronis Cyber Protect, which natively integrates cybersecurity, data protection, and data management to protect endpoints, systems, and data.

Learn more and compare products with the Solutions Review Data Protection Buyer’s Guide.

Amazon Web Services (AWS) offers a range of IT infrastructure services to enterprises. In addition to storage, the provider’s solutions and products include cloud computing, compute, networking, content delivery, databases, analytics, application services, backup, compliance, data resiliency, data lifecycle management, hybrid cloud backup, and archive.  AWS provides a variety of cloud storage solutions, such as Amazon Elastic Block Store (Amazon EBS), Amazon Simple Storage Service (Amazon S3), and AWS Backup, among others. AWS also offers data-transfer methods and networking options to build solutions that protect data with durability and security. The vendor’s various platforms, including the AWS Partner Network for Storage and Backup and Evolving Backup into Archive and Disaster Recovery, support both application and archival compliance requirements.
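As a hedged illustration of how the AWS building blocks mentioned above can be combined for data protection, the snippet below uses boto3 to enable versioning and a lifecycle rule on an S3 bucket; the bucket name, prefix, and retention values are assumptions for the example only.

```python
import boto3

# Illustrative only: bucket name, prefix, and retention values are assumptions.
s3 = boto3.client("s3")
BUCKET = "example-backup-bucket"

# Keep prior object versions so accidental deletes/overwrites are recoverable.
s3.put_bucket_versioning(
    Bucket=BUCKET,
    VersioningConfiguration={"Status": "Enabled"},
)

# Tier aging backups to colder storage and expire old noncurrent versions.
s3.put_bucket_lifecycle_configuration(
    Bucket=BUCKET,
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "tier-and-expire-backups",
                "Filter": {"Prefix": "backups/"},
                "Status": "Enabled",
                "Transitions": [{"Days": 30, "StorageClass": "GLACIER"}],
                "NoncurrentVersionExpiration": {"NoncurrentDays": 365},
            }
        ]
    },
)
```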

Learn more and compare products with the Solutions Review Data Protection Buyer’s Guide.


Arcserve offers several backup products, including Arcserve Unified Data Protection (UDP), Arcserve Replication and High Availability, Arcserve UDP Cloud Direct, UDP Cloud Hybrid, and a legacy offering. UDP provides comprehensive Assured Recovery for virtual and physical environments with a unified architecture, backup, continuous availability, migration, email archiving, and an easy-to-use console. The product enables organizations to scale their IT environments easily while delivering against recovery point and recovery time objectives, on-prem or in the cloud. It also allows for automated disaster recovery testing of business-critical systems, applications, and data without business downtime or impact on production systems. Recovery testing can be fully automated or performed on a scheduled basis.

Learn more and compare products with the Solutions Review Data Protection Buyer’s Guide.


Asigra is built for cloud computing environments and designed to offer backup efficiencies by allowing enterprises to capture, ingest, and store less data. Designed for compatibility with public, private, and hybrid cloud architectures, the Asigra Cloud Backup solution is equipped with agentless software architecture, global deduplication, and data compression technology, along with NIST FIPS 140-2 certified security. Asigra also offers ransomware protection, business continuity, and compliance management. The platform provides bi-directional malware detection, deep MFA, immutable retention, and variable repository naming. In addition, the vendor reduces recovery time objectives and eliminates silos of backup data.
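Global deduplication, one of the capabilities noted above, is worth illustrating in general terms. The sketch below is not Asigra's implementation; it is a minimal content-addressed approach in which files are split into fixed-size chunks and only previously unseen chunks, identified by SHA-256, are stored.

```python
import hashlib
from pathlib import Path

CHUNK_SIZE = 4 * 1024 * 1024  # 4 MiB fixed-size chunks (illustrative choice)

def dedup_ingest(files, chunk_store: dict) -> list:
    """Return a per-file recipe of chunk hashes; store only unseen chunks."""
    recipes = []
    for file_path in files:
        recipe = []
        with open(file_path, "rb") as handle:
            while chunk := handle.read(CHUNK_SIZE):
                digest = hashlib.sha256(chunk).hexdigest()
                if digest not in chunk_store:   # global dedup check
                    chunk_store[digest] = chunk  # persist the chunk once
                recipe.append(digest)
        recipes.append((str(file_path), recipe))
    return recipes

if __name__ == "__main__":
    store = {}
    manifests = dedup_ingest(Path(".").glob("*.log"), store)
    print(f"{len(store)} unique chunks backing {len(manifests)} files")
```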

Learn more and compare products with the Solutions Review Data Protection Buyer’s Guide.


Carbonite offers enterprise cloud-based backup, recovery, and storage solutions. The Carbonite Data Protection Platform allows organizations to deploy the right form of protection for each type of data in their systems, from long-term backup to rapid recovery, data migration, and endpoint protection. The vendor’s Cloud Disaster Recovery centralizes data backup and recovery on all computers distributed throughout an organization’s locations. Carbonite also offers cloud data protection, which protects against accidental deletion, theft, hardware failure, and data corruption. Users can retain business data to meet compliance requirements. Agents automatically back up the data over the internet to a highly secure data center. Additionally, in December 2019, OpenText acquired Carbonite.

Learn more and compare products with the Solutions Review Data Protection Buyer’s Guide.


Clumio provides Software-as-a-Service enterprise backup. Through this secure service, enterprises can eliminate hardware and software for on-prem backup and avoid the complexity and cost of running third-party data protection software in the cloud. The vendor’s data protection solution, RansomProtect (released in late 2020), provides an air-gapped and immutable backup platform that protects data regardless of which cloud it resides in. The solution also provides immutable storage, end-to-end encryption, and multi-factor authentication, and meets ISO 27001, PCI, AICPA SOC, and HIPAA certification and compliance designations.
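Immutability is a recurring theme in this category. As a generic, vendor-neutral sketch (assuming AWS S3 Object Lock and illustrative bucket, key, and file names), the snippet below shows one way a backup copy can be made non-deletable until a retention date passes.

```python
from datetime import datetime, timedelta, timezone

import boto3

# Generic immutability sketch; bucket/key/file names and retention are assumptions.
s3 = boto3.client("s3")
BUCKET = "example-immutable-backups"

# Object Lock must be enabled when the bucket is created.
s3.create_bucket(Bucket=BUCKET, ObjectLockEnabledForBucket=True)

# Write a backup object that cannot be deleted or overwritten until retention expires.
s3.put_object(
    Bucket=BUCKET,
    Key="backups/db-2025-01-01.dump",
    Body=open("db-2025-01-01.dump", "rb"),
    ObjectLockMode="COMPLIANCE",
    ObjectLockRetainUntilDate=datetime.now(timezone.utc) + timedelta(days=90),
)
```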

Learn more and compare products with the Solutions Review Data Protection Buyer’s Guide.


Cobalt Iron’s flagship SaaS-based backup solution, Compass, reduces complexity and the amount of time spent on backup. Additionally, the software improves overall data protection performance. The enterprise data protection platform offers four main product components: Commander, Analytics Engine, Accelerators, and Accelerator Operating System. With this solution, users can access a range of analytics-driven data management capabilities through a unified web user experience. Additionally, the solution offers ransomware detection, alerting and notification, and remediation capabilities, and is available on AWS, Azure, Google Cloud, IBM Cloud, and Alibaba Cloud.

Learn more and compare products with the Solutions Review Data Protection Buyer’s Guide.


Code42 offers backup, disaster recovery, and data loss protection solutions. The provider’s data loss protection solution (Next-Gen DLP) detects insider threats, satisfies regulatory compliance requirements, automatically monitors file activity across computers and the cloud, and facilitates incident response. Additionally, with Code42, security, IT, and compliance professionals can protect endpoint and cloud data from loss, leak, and theft while maintaining a collaborative culture for employees. This solution is recommended for backing up user data in the cloud or on-prem for compliance and simple recovery. Code42 also offers Incydr, which provides data risk detection and response for insider threats, as well as CrashPlan Cloud, which provides endpoint data backup and recovery for enterprises.

Learn more and compare products with the Solutions Review Data Protection Buyer’s Guide.



Cohesity is a data management company that manages, protects, and extracts value from enterprise data. The provider’s flagship tool, Cohesity DataProtect, safeguards a wide range of data sources on a single web-scale platform. The solution can be deployed on qualified platforms on-premises in the data center, in the public cloud, and at the edge. Additionally, the platform utilizes a scale-out architecture that starts with a minimum of three nodes and scales without disruption by adding nodes to the cluster. Through this solution, users can work with backup data directly on the platform without needing to restore it, which allows for the consolidation of other use cases, including dev/test and analytics. Recently, Cohesity raised $250 million in Series E funding.

Learn more and compare products with the Solutions Review Data Protection Buyer’s Guide.


Commvault provides data protection and information management software to help organizations protect, access, and use all of their data economically. The vendor supports a long list of public cloud providers, hypervisors, big data workloads, and databases. The platform is primarily offered as a software-only solution, but Commvault also has an appliance option and an enterprise-grade SaaS offering for backup and recovery through Metallic. The provider offers Commvault Complete Data Protection, an all-in-one solution combining Commvault Backup & Recovery with Commvault Disaster Recovery for enterprise-level data protection. The solution provides backup, replication, disaster recovery orchestration, copy data management, scale-out architecture, ransomware protection, data and application migration support, and a web-based user interface.

Learn more and compare products with the Solutions Review Data Protection Buyer’s Guide.


Dell EMC’s backup and recovery solution is a prepackaged backup suite made up of several components available for individual sale. These include Avamar, NetWorker, Data Protection Advisor, and the new PowerProtect DP series appliances, with options for cloud backup and archiving. Dell EMC’s solutions provide a full range of data protection, from archive to continuous availability, for physical, virtual, and cloud environments. The provider enables digital transformation through hybrid cloud and big-data solutions built on a modern data infrastructure, bringing together converged infrastructure, storage, and servers. The recently launched all-in-one Integrated Data Protection Appliance combines Data Domain with Avamar and DD Boost for Enterprise Applications.

Learn more and compare products with the Solutions Review Data Protection Buyer’s Guide.


Druva delivers data protection and management for the cloud era. Druva Cloud Platform is built on AWS and offered as a service. Druva Phoenix simplifies data protection, improves visibility, and significantly reduces the risk, cost, and effort of managing complex data. The solution operates seamlessly and can be managed from one location, giving IT administrators full visibility and control over server backups and data composition. Phoenix also offers long-term retention and archiving, enterprise-level RTO and RPO, and enterprise-grade security. Druva is used worldwide by over 4,000 companies at the forefront of embracing the cloud. Additionally, Druva recently acquired SFApex for an undisclosed amount.

Learn more and compare products with the Solutions Review Data Protection Buyer’s Guide.


HYCU specializes in multi-cloud data backup, management, migration, protection, and recovery for on-premises and hyper-converged (HCI), Google Cloud, Azure Cloud, and multi-cloud infrastructures. Headquartered in Boston, Mass., HYCU harnesses 25 years of sophisticated IT experience and insights from over one million users, and works with more than 25,000 customers worldwide. The result is alignment with industry leaders and a competitive advantage in the multi-cloud space. HYCU’s flagship products (a purpose-built data protection solution for Nutanix, a managed Data Protection-as-a-Service for Google Cloud Platform and Azure Cloud, and HYCU Protégé, a multi-cloud data protection solution) offer one-click cross-cloud migration, disaster recovery, and consolidated management.

Learn more and compare products with the Solutions Review Data Protection Buyer’s Guide.


Infrascale offers an enterprise-grade cloud-based data protection solution that provides failover to a second site with the flexibility to boot from the appliance or cloud. The product is delivered as a physical or virtual appliance and includes disaster recovery software. Infrascale’s dashboard simplifies data protection management by providing a single view through which the entire suite of services is deployed. An administrative dashboard, accessible from any browser or device, makes it easy to recover mission-critical applications and systems with push-button simplicity. Users can set up their organization’s protection needs through single-pane-of-glass management to ensure all of their critical data is covered.

Learn more and compare products with the Solutions Review Data Protection Buyer’s Guide.


Nightfall AI utilizes machine learning to identify business-critical data across SaaS, APIs, and data infrastructure so it can be protected and managed. The vendor offers data protection software for Slack, Google Drive, GitHub, Confluence, Jira, and Amazon Web Services. Additionally, Nightfall AI’s data protection solution enables users to achieve HIPAA compliance, stop data exfiltration, protect credentials and critical data, prevent insider threats, secure customer and employee data, and implement content moderation policies. In late 2019, the provider raised $20.3 million in Series A funding. In late 2020, Nightfall AI joined CircleCI’s partner network.
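To give a rough sense of the detection problem such tools address, here is a deliberately simplistic sketch that flags possible PII and credentials with regular expressions; real DLP products rely on trained ML detectors and far broader rule sets, and the patterns below are illustrative assumptions only, not Nightfall's models.

```python
import re

# Illustrative detectors only; production DLP uses ML classifiers and many more rules.
PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    "aws_access_key": re.compile(r"\bAKIA[0-9A-Z]{16}\b"),
}

def scan_text(text: str) -> list:
    """Return (detector, match) pairs for potentially sensitive content."""
    findings = []
    for name, pattern in PATTERNS.items():
        for match in pattern.findall(text):
            findings.append((name, match))
    return findings

if __name__ == "__main__":
    sample = "Contact 123-45-6789, key AKIAABCDEFGHIJKLMNOP"
    for detector, value in scan_text(sample):
        print(f"{detector}: {value}")
```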

Learn more and compare products with the Solutions Review Data Protection Buyer’s Guide.


StorageCraft offers backup, disaster recovery, and business continuity solutions for servers, desktops, and laptops. The vendor’s data protection solutions, ShadowXafe, OneXafe Solo, ShadowProtect, Granular Recovery for Exchange, ShadowProtect IT Edition, and File Backup and Recovery, reduce downtime, improve security and stability for systems and data, and lower the total cost of ownership. StorageCraft’s business focuses on data protection and restoration tools offered via value-added and channel partners. However, it also provides scale-out storage, replication, recovery, integrated data protection, and more. The solution is supported in on-prem and cloud-based environments, as well as hybrid deployments.

Learn more and compare products with the Solutions Review Data Protection Buyer’s Guide.


87 Data Protection Predictions from 46 Experts for 2024

For our 5th annual Insight Jam LIVE!, Solutions Review editors sourced this resource guide of data protection predictions for 2024 from Insight Jam, its new community of enterprise tech experts.

Note: Data protection predictions are listed in the order we received them.

Data Protection Predictions from Experts for 2024


Bobby Cornwell, Vice President Strategic Partner Enablement & Integration at SonicWall

Expect to See New Regulations for Reporting Breaches

“In 2024, incoming cybersecurity regulations will force businesses to be more transparent about their breaches and attacks. Forthcoming legislation such as the EU’s NIS2 Directive and the Cyber Resilience Act will impose more stringent standards for cyber protection and establish clear reporting timelines in the event of a breach. As these directives take effect, businesses will be made to share with their partners and suppliers early identifications of system vulnerabilities or face fines. The aim of this is to prevent cybercriminals from inflicting widespread damage across multiple businesses. In 2024, it will be crucial to optimize the transparency afforded by these regulations, and by dragging cybercriminals out into the open, authorities can more effectively curtail their illicit activity.”

Samir Zaveri, Practice Manager – Package Led Transformation at Wipro

Future of Data Protection in Hybrid Cloud Deployments

“On one hand, hybrid cloud adoption will continue to grow exponentially, and on the other hand, organizations are looking to repatriate workloads back to private clouds. Data protection in a single cloud environment was already a challenge, and with data distributed across multiple clouds and cloud service providers, the challenge has grown even more. Today, organizations have a one-to-one mapping between source clouds and data backup and disaster recovery sites, which leads to multiple standard operating procedures and multiple points of data theft, along with inconsistent recovery SLAs.”

To overcome these challenges, organizations will need to adopt new capabilities

“Full workload portability is one of them. With portability, organizations have the ability to deploy workloads across different cloud service providers without having to adapt to each environment and with no changes needed to the application or the infrastructure. Portability will give organizations a way to consolidate multiple sources into one single data backup and disaster recovery site, as well as consolidate standard operating procedures (SOPs), all with consistent recovery SLAs.”

Eyal Arazi, Cloud Security Manager at Radware

Migration to the cloud will slow down as companies reverse course

“During the past few years, there has been a rapid adoption of multi-cloud strategies, with organizations often using three, four and even five different cloud environments. The problem, however, is that while organizations accumulated more cloud platforms, they ignored the problems of cross-cloud security consistency, visibility and management.

A recent survey commissioned by Radware suggests that now organizations are starting to reverse course. Despite the ongoing discussion about “the great cloud migration” and the abandonment of on-premises environments, approximately three quarters of organizations not only still use these environments but expect usage to increase in the next 12 months. Based on the report, look for more companies to consolidate their cloud environments from three or more to one or two platforms in 2024. While there will be consolidation around the cloud, most organizations will continue to maintain a combination of public cloud and on prem platforms, resulting in a hybrid environment.”

Andrew Moloney, Chief Strategy Officer at SoftIron

Cloud strategy moves from “fashionable to rational” 

“Moving from an era when proposing a full-scale migration to the public cloud was a sure-fire way to promotion, the current maturation of the market, economic conditions, shifting performance requirements, and a dramatic simplification in building private cloud-native infrastructure, will see a much more rational approach. Underpinning this will be a broader understanding of the difference between cloud “the model” and cloud the “place”, where how applications are built is decoupled from where they operate.”

A sovereign cloud shake out 

“We predict that many of the “pseudo” sovereign cloud projects – those that rely on obfuscated infrastructure and/or local third parties to operate them to provide a veneer of sovereignty – will not gain traction. AWS, late to the party to offer such a service and having recently launched their European Sovereign Cloud, may well be delivering too little, too late. Instead, those that offer true sovereign resilience – enabling nation-states to build, operate, inspect, and audit their own infrastructure on their own terms and turf – will become the preferred option.”

VMware acquisition accelerates the adoption of cloud-native infrastructure 

“Forced into seeking credible alternatives to using VMware to provide virtualized infrastructure in on-prem data centers, existing VMware customers will take the opportunity to revisit their cloud strategy, making the rational decision to shift to a fully cloud-native infrastructure – one able to consolidate and simplify existing virtualized on-prem workloads while delivering true private cloud going forward. Finally, they will be able to deliver what VMware and Nutanix have promised for years but have never quite been able to deliver.”

A renaissance for Private Cloud 

“Partially related to our VMware prediction, and the availability of cloud-native infrastructure that changes the economics of private cloud, the evolution of a more rational cloud strategy will see Cloud Centers of Excellence (CCoEs) and FinOps professionals grasp the opportunity to get an apples-to-apples comparison across not just public clouds, but now between public and private cloud. New open standards released in 2024, such as FOCUS, will help to enable this.

At the same time, shifts to distributed cloud architectures, enabling workloads to move from the edge to the core and back, will elevate the need to make private clouds more than just basic virtualized infrastructure.”

The death of “Hyper-Converged” 

“Already effectively abandoned by its greatest exponent, Nutanix, the independent and elastic scaling limitations inherent in these architectures, plus their failures to fully deliver on a fully cloud-native environment without significant integrations and third parties will see hyperconvergence relegated from the data center to smaller, departmental type solutions only.”

Software-defined fades, hardware, and hard tech get sexy again 

“Fuelled by the hype around AI and the investments being made in the processing power to support it, we predict we’ll see a resurgence in interest in innovation right in hardware, and the hard tech required to support that. A new generation of start-ups will disrupt the inertia in innovation in IT infrastructure design of the last couple of decades.”

Thomas Chauchefoin, Vulnerability Research at Sonar

AI-Assisted attacks to become more sophisticated and automated

“IT security attacks leveraging AI are expected to become more sophisticated and automated. Hackers will likely use AI to analyze vast amounts of data and launch targeted attacks. AI-driven phishing attacks capable of generating highly convincing and personalized messages, which trick users into revealing sensitive information, may increase. Furthermore, AI-powered malware could adapt and evolve in real time, making it more challenging for traditional antimalware detection systems to keep up.”

Stacy Hayes, Co-Founder and EVP, Americas at Assured Data Protection

“The managed services model will become increasingly attractive to traditional VARs in 2024, especially with more and more businesses looking to buy cloud and IT services on a usage basis. But making the transition from a traditional VAR to a provider of managed services is easier said than done. It’s not that VARs aren’t capable of diversifying, far from it, it’s just that the switch requires a fundamental shift in the way VARs do business and that isn’t something you can just change overnight. These large organizations are not built for this new world model. The in-house build and integration of new technology and go-to-market models takes too long and is too expensive to implement. VARs simply don’t have the people, the flexibility or the know how. With the economic headwinds as they are, Opex is king and no-one has the Capex or the appetite for big in-house builds.

It is becoming increasingly difficult for VARs to provide a large portfolio of products and services to the standards customers demand. The speed the market moves, the reliance on data, all add to greater demands from customers. It is evident channel businesses are struggling to deliver what their customers want, whether it be on-premises or in the cloud. It is a common topic and one I believe means VARs need to clearly understand what they can deliver themselves, and what they need to outsource. Outsourcing, white labelling, is a great way to deliver a high quality and diverse portfolio to customers.

MSPs that have the know-how to use utility based models effectively, that can execute immediately, have experts in the space and deliver services tailored for the vendor, customer, end user will be the partners of choice for VARs in 2024.”

Jim Liddle, Chief Innovation Officer at Nasuni

2024 will be a make-or-break year for data intelligence  

“Following the booming interest in AI in 2023, enterprises will face increased pressure from their boards to leverage AI to gain a competitive edge. That rush for an AI advantage is surfacing deeper data infrastructure issues that have been mounting for years. Before they can integrate AI effectively, organizations will first have to address how they collect, store, and manage their unstructured data, particularly at the edge.

AI doesn’t work in a vacuum and it’s just one part of the broader data intelligence umbrella. Many organizations have already implemented data analytics, machine learning and AI into their sales, customer support, and similar low-hanging initiatives, but struggle to integrate the technology in more sophisticated, high-value applications. 

Visibility, for example, is a crucial and often-overlooked first step towards data intelligence. A shocking number of companies store massive volumes of data simply because they don’t know what’s in it or whether they need it. Is the data accurate and up-to-date? Is it properly classified and ‘searchable’? Is it compliant? Does it contain personal identifiable information (PII), protected health information (PHI), or other sensitive information? Is it available on-demand or archived?  

In the coming year, companies across the board will be forced to come to terms with the data quality, governance, access, and storage requirements of AI before they can move forward with digital transformation or improvement programs to give them the desired competitive edge.” 

2024 will be the year of reckoning for both ransomware and compliance 

“The risk of ransomware and sophisticated attacks is ever-growing and will continue to spread internationally in 2024. Preventing the theft, encryption, misuse, or exposure of sensitive data will remain a daily concern for organizations indefinitely. Multi-layer protection has quickly become a matter of hygiene and even companies that invested in sophisticated, global ransomware protection products will need a belt and braces approach in the form of network, application, and access security, coupled with rapid data recovery solutions. 

Ransomware has typically been more prevalent in the US, with larger organizations and their larger data sets presenting more attractive targets for bad actors. In 2024, we’ll see more ransomware incidents in the UK as government agencies, health services, and critical infrastructure in both countries continue to lack the technology and funding to build adequate data protection and recovery capabilities. 

Organizations that haven’t addressed their data protection and recovery posture are now risking both security and compliance headaches, as regulatory penalties and recovery costs often outmatch ransom payouts. Europe still leads in data governance and regulation with the likes of GDPR, but legislation like the California Consumer Privacy Act (CCPA) is quickly spreading across the US. By delaying their investment in protection and compliance solutions until forced to, many large organizations will soon face the possibility of steep penalties, ransom demands, and business disruption simultaneously.” 

Russ Kennedy, Chief Product Officer at Nasuni

Enterprises will embrace hybrid infrastructure or fall behind 

“The next revolution in data will occur at the edge. After years of conflicting definitions and uncertainty, today’s leading businesses are realizing the necessity of truly hybrid infrastructure. To remain competitive in a data-driven world, enterprises need high performance processing at the edge, where data is generated, in combination with the scale, capacity, and advanced tools available in the cloud.  

Traditionally, large companies have used legacy storage vendors and traditional backup solutions to store and protect petabyte volumes of data. These legacy infrastructures are a performance bottleneck and can’t support the pace of growth, as analyst firm William Blair recently highlighted.

Over the next few years, we’ll see more organizations realize it’s not one or the other, but a combination of edge and cloud storage. According to Gartner, 50 percent of critical infrastructure applications will reside outside of the public cloud through 2027. Manufacturers, for example, need to quickly capture and consolidate the critical data coming from their physical systems and processes across the world, while keeping and leveraging that data for analytics year after year. Ready or not, we’ll see this edge-cloud mechanism force organizations to adopt and embrace truly hybrid infrastructure and ultimately transform their ability to drive more effective innovations and respond in a more agile way to customers’ evolving needs.”

Organizations will continue to grapple with data infrastructure to support hybrid work long after the pandemic 

“The genie is out of the bottle and hybrid or remote is here to stay. Though the greatest economic upheavals have hopefully passed, we’re seeing the residual effects. Many companies are still trying to design or optimize infrastructure to accommodate hybrid work and reconfigured supply chains.  

Though organizations worked quickly to spin up the necessary systems, they simply weren’t designed to support thousands of remote workers. Inevitably, workers started using whatever tools necessary to collaborate, and many businesses saw a significant increase in shadow IT tools outside of sanctioned corporate IT programs. As we enter 2024, IT organizations are still grappling with the effects of remote work on top of mounting pressure to reduce costs and regain control of their disparate and sprawling corporate data assets. 

Some have tried to remedy the issue by mandating employees back into the office, but to attract and retain appropriate talent, businesses will need to provide enhanced multi-team collaboration options and the data infrastructure to scale it. Those that have the right data access solutions in place to streamline processes and remote collaboration will succeed in the hybrid work economy.” 

Matt Waxman, Senior Vice President and GM for Data Protection at Veritas Technologies

The first end-to-end AI-powered robo-ransomware attack will usher in a new era of cybercrime pain for organizations

“Nearly two-thirds (65 percent) of organizations experienced a successful ransomware attack over the past two years in which an attacker gained access to their systems. While startling in its own right, this is even more troubling when paired with recent developments in artificial intelligence (AI). Already, tools like WormGPT make it easy for attackers to improve their social engineering with AI-generated phishing emails that are much more convincing than those we’ve previously learned to spot. In 2024, cybercriminals will put AI into full effect with the first end-to-end AI-driven autonomous ransomware attacks. Beginning with robocall-like automation, eventually AI will be put to work identifying targets, executing breaches, extorting victims and then depositing ransoms into attackers’ accounts, all with alarming efficiency and little human interaction.”

Targeted cell-level data corruption will make ransomware more dangerous than ever

“As more organizations become better prepared to recover from ransomware attacks without paying ransoms, cybercriminals will be forced to continue evolving. In 2024, we expect hackers to turn to targeted cell-level data corruption attacks—code secretly implanted deep within a victim’s database that lies in wait to covertly alter or corrupt specific but undisclosed data if the target refuses to pay a ransom. The real threat is that victims will not know what data—if any, the hackers could be bluffing—has been altered or corrupted until after the repercussions set in, thus effectively rendering all their data untrustworthy. The only solution is to ensure they have secure copies of their data that they are 100 percent certain are uncorrupted and can be rapidly restored.”
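One practical way to gain that certainty, independent of any particular vendor, is to record cryptographic checksums at backup time and verify them before restoring. The sketch below is a minimal illustration under assumed paths and a JSON manifest format, not a description of any product's mechanism.

```python
import hashlib
import json
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Hash a file's contents in streaming fashion."""
    digest = hashlib.sha256()
    with open(path, "rb") as handle:
        for block in iter(lambda: handle.read(1 << 20), b""):
            digest.update(block)
    return digest.hexdigest()

def build_manifest(backup_dir: Path, manifest_path: Path) -> None:
    """Record a checksum for every file at backup time."""
    manifest = {str(p.relative_to(backup_dir)): sha256_of(p)
                for p in backup_dir.rglob("*") if p.is_file()}
    manifest_path.write_text(json.dumps(manifest, indent=2))

def verify_manifest(backup_dir: Path, manifest_path: Path) -> list:
    """Return files whose contents no longer match the recorded checksum."""
    manifest = json.loads(manifest_path.read_text())
    return [name for name, expected in manifest.items()
            if sha256_of(backup_dir / name) != expected]
```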

Adaptive data protection will autonomously fight hackers without organizations lifting a finger

“More than two-thirds of organizations are looking to boost their cyber resiliency with the help of AI. But, given AI’s dual nature as a force for both good and bad, the question going forward will be whether organizations’ AI-powered protection can evolve ahead of hackers’ AI-powered attacks. Part of that evolution in 2024 will be the emergence of AI-driven adaptive data protection. AI tools will be able to constantly monitor for changes in behavioral patterns to see if users might have been compromised. If the AI detects unusual activity, it can respond autonomously to increase their level of protection. For example, initiating more regular backups, sending them to differently optimized targets and overall creating a safer environment in defense against bad actors.”
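As a toy illustration of the adaptive idea described above, and not Veritas's implementation, the sketch below shortens the backup interval when the number of changed files deviates sharply from its recent baseline; the intervals and z-score threshold are illustrative assumptions.

```python
import statistics

# Illustrative thresholds; a real system would learn richer behavioral baselines.
NORMAL_INTERVAL_MIN = 240   # back up every 4 hours normally
ELEVATED_INTERVAL_MIN = 30  # back up every 30 minutes under suspicion

def next_backup_interval(history: list, files_changed_now: int) -> int:
    """Shorten the backup interval when file-change volume looks anomalous."""
    if len(history) < 5:
        return NORMAL_INTERVAL_MIN  # not enough data to judge
    mean = statistics.mean(history)
    stdev = statistics.pstdev(history) or 1.0
    z_score = (files_changed_now - mean) / stdev
    return ELEVATED_INTERVAL_MIN if z_score > 3 else NORMAL_INTERVAL_MIN

# Example: a sudden spike in changed files triggers more frequent backups.
print(next_backup_interval([120, 135, 110, 150, 128, 140], 5200))  # -> 30
```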

Generative AI-focused data compliance regulations will impact adoption

“For all its potential use cases, generative AI also carries heavy risks, not the least of which are data privacy concerns. Organizations that fail to put proper guardrails in place to stop employees from potentially breaching existing privacy regulations through the inappropriate use of generative AI tools are playing a dangerous game that is likely to bring significant consequences. Over the past 12 months, the average organization that experienced a data breach resulting in regulatory noncompliance shelled out more than US$336,000 in fines. Right now, most regulatory bodies are focused on how existing data privacy laws apply to generative AI, but as the technology continues to evolve, expect generative AI-specific legislation in 2024 that applies rules directly to these tools and the data used to train them.”

For every organization that makes the jump to the cloud, another will develop an on-premises datacenter as hybrid cloud equilibrium sets in

“The percentage of data stored in the cloud versus on-premises has steadily grown to the point where it is estimated that 57 percent of data is now stored in the cloud, with 43 percent on-premises. That growth has come from both mature companies with on-premises foundations making the jump to the cloud, and newer companies building their infrastructure in the cloud from the ground up. But both categories of organizations are learning that, for all its benefits, the cloud is not ideally suited for all applications and data. This is leading many companies that made the jump to the cloud to partially repatriate their data and cloud-native companies to supplement their cloud infrastructure with on-premises computing and storage resources. As a result, in 2024, we’ll see hybrid cloud equilibrium—for every organization that makes the move to the cloud, another will build an on-premises datacenter.”

Cassius Rhue, VP of Customer Experience at SIOS Technology

Application high availability becomes universal

“As reliance on applications continues to rise, IT teams will be pressured to deliver efficient high availability for applications once considered non-essential. Once reserved for mission-critical systems, such as SQL Server, Oracle, SAP, and HANA, application high availability – typically delivered with HA clustering technology – will become a requirement for more systems, applications, and services throughout the enterprise.”

Cloud and OS agnostic high availability becomes an expected requirement for most applications

“IT teams will look for application HA solutions that are consistent across operating systems and clouds, reducing complexity and improving cost-efficiency. As the need for HA rises, companies running applications in both on-prem and cloud environments, as well as those running applications in both Windows and Linux environments, will look to streamline their application environments with HA solutions that deliver a consistent user interface across all of their environments, and also for matching cloud and OS technical support and services from the HA vendor.”

The trend toward migration to SAP HANA is likely to continue in 2024

“The mandatory 2027 migration will push more companies to migrate to SAP HANA. As companies migrate to SAP HANA there will be an increased need for more sophisticated and flexible high availability and disaster recovery solutions that help them bridge the gap between existing systems and the new, more modern systems that take advantage of SAP HANA’s capabilities. Organizations will look for HA solutions that help them find ways to take advantage of emerging technologies and accelerate digital transformation, while not losing the HA and DR capabilities that continue to arise.”

Automation becomes more common in high availability and disaster recovery efforts as data and analytics increase complexity

“As the volume and variety of data as well as the channels through which data are collected increase, organizations will require more information about why faults/failures occurred and how to address potential issues. Automation and orchestration tools will play a central role, streamlining root cause analysis, improving intelligent responses, and enhancing disaster recovery processes to further reduce downtime and enhance data availability.”

The focus on data security and compliance will intensify

“The focus on data retention, security, and access controls will intensify prompting organizations to integrate enhanced security measures deeper into their high availability and disaster recovery solutions, services, and strategies. As the volume and variety of data as well as the channels through which data are collected and processed increase, organizations will require more security measures to be baked into their solutions.”

Sophisticated storage and DR strategies will become crucial to the demands of an increasingly dynamic and data-driven business landscape

“As the volume of unstructured data continues to surge, storage solutions are expected to prioritize scalability, tiered performance, and accessibility. Enterprises will also adopt more sophisticated and resilient DR strategies using multiple high availability (HA) nodes, and DR technologies that understand the complexity of tiered storage solutions. Cloud storage is expected to continue its ascendancy, with organizations increasingly relying on scalable and flexible cloud solutions to meet their expanding data requirements. At the same time, a growing number of companies will look to move workloads out of the cloud to on-prem environments in favor of more predictable costs and greater control over their environments.”

Justin Borgman, Co-Founder and CEO at Starburst

All things make a comeback and on-prem storage is having a resurgence

“Companies including Dell have heavily invested in their EMC portfolio. Enterprise customers will continue to recognize that enhancing on-premise storage hardware presents the faster path to mitigating rising cloud expenses. This modernization will allow companies to manage data gravity for on-premise data that cannot be easily relocated, ensuring a more efficient approach.”

Haoyuan Li, Founder and CEO at Alluxio

Hybrid and Multi-cloud Acceleration

“In 2024, the adoption of hybrid and multi-cloud strategies is expected to accelerate, both for strategic and tactical reasons. From a strategic standpoint, organizations will aim to avoid vendor lock-in and will want to retain sensitive data on-premises while still utilizing the scalable resources offered by cloud services. Tactically, due to the continued scarcity of GPUs, companies will seek to access GPUs or specific resources and services that are unique to certain cloud providers. A seamless combination of cross-region and cross-cloud services will become essential, enabling businesses to enhance performance, flexibility, and efficiency without compromising data sovereignty.”

From Specialized Storage to Optimized Commodity Storage for AI Platform

“The growth of AI workloads has driven the adoption of specialized high-performance computing (HPC) storage optimized for speed and throughput. But in 2024, we expect a shift towards commoditized storage. Cloud object stores, NVMe flash, and other storage solutions will be optimized for cost-efficient scalability. The high cost and complexity of specialized storage will give way to flexible, cheaper, easy-to-manage commodity storage tailored for AI needs, allowing more organizations to store and process data-intensive workloads using cost-effective solutions.”

Jimmy Tam, CEO at Peer Software

Active-Passive High Availability Practices Evolve – Active-Active Has its Moment

“Without continuous availability and real-time access to data, businesses risk losing out to competitors, making decisions with inaccurate information, and more. So it is no wonder that CIOs are starting to demand more from their data centers. In the coming 12 months, it is likely that many IT leaders will start to adopt active-active capabilities, improving performance by distributing the workload across several nodes to allow access to the resources of all servers. 

By moving away from active-passive technologies that simply don’t make the most of the available servers and often require manual intervention during outages, CIOs will ensure that data is actionable wherever it resides, is as close as possible to the end-user for performance, and that the load of data processing is spread across all compute and storage nodes whether it be at the edge, in the data center, or in the cloud.”

The storage industry will start to productize AI and ML 

“AI and Machine Learning have so much promise, but they’re not being adopted as quickly as anyone in the industry anticipated. There’s a clear reason why: users simply don’t know how to realize the technologies’ full potential. Beyond ChatGPT, which is easy to use and incredibly popular, there’s no real out-of-the-box product for enterprise storage customers. So unless organizations have a data scientist on hand to help them navigate the intricacies of AI and ML, they’re very likely to hold off when it comes to implementing any kind of solution.

This presents a great opportunity for the storage industry and the smart companies are already starting to think about it. Through 2024, we’ll see the beginning of the productization of AI and ML. Ready-to-use packages will be developed so that users can easily understand what the technologies can help them achieve, while being straightforward to set up and run. Then watch, as AI and ML adoption increases.”

Virtual Desktop Infrastructure is here to stay – but much will move back on-premise

“When Covid hit, VDIs were the reason many of us could continue to work. They offered users a flexible, consistent experience from wherever they logged in and became a lynchpin for organizations during the days of lockdown. But there was an issue: the hardware was difficult to get hold of. And the urgency we all became so used to during the pandemic meant there was no time to wait for the supply chain to right itself, so CIOs turned to the cloud. 

Don’t get me wrong, the cloud has clear benefits. It is easy to implement, and it is elastic in nature, quickly responding to and growing with our needs. But it can be very expensive and, because cloud providers tend to charge for each transaction, costs can be difficult to predict. Availability in the supply chain will bring about a shift towards migrating highly transactional workloads back on-premise. Unhappy with writing blank checks, CFOs will rightly start to ask CIOs to demonstrate ROI and explain the cost difference between cloud and on-premise.”

JB Baker, Vice President of Marketing & Product Management at ScaleFlux

Sustainable Data Storage Becomes a Priority

“With sustainability rising as an urgent priority across industries, data storage solutions will be under increasing pressure to reduce their environmental impact. Organizations are ramping up investments in energy-efficient technologies to meet emissions requirements and goals. Data storage, projected to account for 14 percent of the global carbon footprint by 2040, will be a key focus area.

To minimize the footprint of the data center, storage leaders will need to look beyond device-level specs and take a solution-wide view. The criteria will expand to encompass data compression, energy expenditure, workload optimization, and more. The goal is to maximize efficiency and minimize power consumption across the storage infrastructure. As sustainability becomes a competitive differentiator, we will see rapid innovation in “green” data storage technologies, architectures, and management techniques. The storage domain will play a critical role in driving the sustainability transformation.”

Anand Babu, Co-Founder & CEO at MinIO

Unstructured data becomes a core enterprise challenge

“Over the last few years, we have seen explosive growth in the semi-structured data world (log files, models, snapshots, artifactory code) which has, in turn, driven the growth of object storage.”

“In 2024, we’ll see an enterprise explosion of truly unstructured data (audio, video, meeting recordings, talks, presentations) as AI applications take flight. This is highly ‘learnable’ content from an AI perspective and gathering it into the AI data lake will greatly enhance the intelligence capacity of the enterprise as a whole, but it also comes with unique challenges.”

“There are distinct challenges with maintaining performance at tens of petabytes. Those generally cannot be solved with traditional SAN/NAS solutions — they require the attributes of a modern, highly performant object store. This is why most of the AI/ML technologies (i.e., OpenAI, Anthropic, Kubeflow) leverage object stores, and why most databases are moving to be object storage centric.”

Jon France, CISO at ISC2

We’ll see an evolution, rather than a revolution of regulations

“The regulatory landscape will continue to stay hot – I think we’ll see more regulations governing AI and privacy in particular, and we’ll likely see more backlash around reporting requirements and a push for agencies to define what should actually be reported and at what thresholds of materiality. However, I don’t see a major overhaul coming. Instead, I think what we’ll see is sectors grappling with the tangible effects of the requirements that have been introduced. We’re no longer looking at these regulations as being on the horizon…in 2024, they’ll have to be adhered to. With this, I hope to see increased harmonization of regulations globally, so that multinational companies don’t run into navigational issues of not knowing which regulations and policies to follow and which don’t apply. We’re starting to see increased communication on a global scale, but we’re not there yet. It may be wishful thinking, but I predict we’ll see major global powers collaborating on what a cyber secure world should look like, and making policy decisions based on those discussions.”

Mark Cassetta, Chief Product Officer at Axiomatics

With recent legislation, the security market is poised to shift focus

“2023 saw a number of notable startups, especially incorporating Generative AI into their offerings. As fast as enterprises have started to adopt the usage, legislation has been discussed and shared to try to further protect the US identity and economy. This means that in 2024, just as both the mobile (MDM) and cloud platform shifts (CASB) created their security categories, we will see the same very quickly formally emerge for Generative AI.”

Giorgio Regni, CTO at Scality

HDDs will live on, despite predictions of a premature death

“Some all-flash vendors prognosticate the end of spinning disk (HDD) media in the coming years. While flash media and solid state drives (SSDs) have clear benefits when it comes to latency, are making major strides in density, and the cost per GB is declining, we see HDDs holding a 3-5x density/cost advantage over high-density SSDs through 2028.

Therefore, the current call for HDD end-of-life is akin to the tape-is-dead arguments from 20 years ago. In a similar way, HDDs will likely survive for the foreseeable future as they continue to provide workload-specific value.”

End users will discover the value of unstructured data for AI

“The meteoric rise of large language models (LLMs) over the past year highlights the incredible potential they hold for organizations of all sizes and industries. They primarily leverage structured, or text-based, training data. In the coming year, businesses will discover the value of their vast troves of unstructured data, in the form of images and other media.

This unstructured data will become a useful source of insights through AI/ML tooling for image recognition applications in healthcare, surveillance, transportation, and other business domains. Organizations will store petabytes of unstructured data in scalable “lakehouses” that can feed this unstructured data to AI-optimized services in the core, edge and public cloud as needed to gain insights faster.”

Ransomware detection will be the next advancement in data protection solutions

“In recent years, the tech industry has made tremendous strides in protecting data against all manner of threats, including increasingly destructive malware and ransomware. This is exemplified by the rise of immutability in data protection and data storage solutions, especially for backup data.

While data protection and restoration are a major cornerstone that serves as a critical last line of defense in a layered cybersecurity infrastructure, new advancements in AI-generated ransomware detection capabilities will emerge in data protection and storage solutions in 2024.”
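One common heuristic behind such detection, shown here only as a generic sketch rather than any vendor's method, is entropy analysis: freshly encrypted files look almost uniformly random. The threshold below is an illustrative assumption, and real products combine many additional signals to limit false positives from already-compressed data.

```python
import math
from collections import Counter
from pathlib import Path

ENTROPY_THRESHOLD = 7.5  # bits/byte; encrypted data approaches 8.0 (illustrative)

def shannon_entropy(data: bytes) -> float:
    """Shannon entropy of a byte string, in bits per byte."""
    if not data:
        return 0.0
    counts = Counter(data)
    total = len(data)
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

def flag_suspicious(backup_dir: Path) -> list:
    """Flag files whose content looks uniformly random (possible encryption)."""
    suspicious = []
    for path in backup_dir.rglob("*"):
        if path.is_file():
            sample = path.read_bytes()[:1 << 20]  # inspect first 1 MiB only
            if shannon_entropy(sample) > ENTROPY_THRESHOLD:
                suspicious.append(str(path))
    return suspicious
```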

Managed services will become key to resolving the complexity of hybrid cloud

“Multi-cloud is a reality today for most enterprises, in their use of multiple SaaS and IaaS offerings from different vendors. However, the use of on-premises and public cloud in a single application or workload has become mired in the complexities of different application deployment models and multiple vendor APIs and orchestration frameworks.

While this has inhibited the powerful agility and cost-reduction promises of the hybrid-cloud model, throughout the coming year, organizations will increasingly leverage the experience and skills of managed service providers (MSPs) to solve these complexity issues and help them achieve business value and ROI.”

Amer Deeba, CEO and Co-Founder at Normalyze

SEC Regulations Will Impact The One Area We Don’t Want to Talk About: Your Data 

“As we know, the new SEC transparency requirements and ruling now require public companies to disclose cybersecurity posture annually and cyber incidents within four days after determining an incident was material. In 2024, this major policy shift will have a significant effect on one key area: data, forcing businesses to think about security with data at the forefront. In response, enterprises will dedicate both effort and budget to support the SEC’s data-first strategy – implementing best practices that assure shareholders that their company’s most valuable asset – data – is protected. In 2024, companies will need to discover where their data resides and who can access it, while proactively remediating risks that have the highest monetary impact in the event of a breach. When faced with this dilemma, companies will lean on automation, specifically end-to-end, automated solutions that center on a holistic approach.

The recent ALPHV/Black Cat and MeridianLink breach underscores the importance for businesses of understanding exactly what data they have, where it lives, and how it is protected. In order to answer critical questions with confidence in the event of a breach and lower the probability of a breach, companies need to build better defenses. The risk of exposure/tagging is not novel, but with these new disclosure requirements, securing the target of such attacks – the data – has gone from a good first practice to an absolute necessity.  Being proactive means that if a breach does occur, you can respond quickly, answer these critical questions, be in compliance with the SEC requirements, and most importantly — respond. To summarize, in 2024 we’ll see organizations separated by their approach to data security. With these regulations, there is no alternative. Organizations must effectively remediate risks to lucrative sensitive data before breaches occur. Only this will allow organizations to respond decisively and confidently if an incident occurs.” 

To Address the Influx of Data, Security Teams Must Approach Data Security Like a Team Sport

“As AI booms, the industry is facing increasing complexity and an influx of data, and companies are grappling with how to keep it all secure. In the height of AI technology adoption, companies will need to refocus in 2024 on what matters most – protecting their data as it gets used by machine learning modes and new AI technologies. Businesses need to change their approach: the success of the coming year for organizations big and small will come back to how they do so. The challenges that this will bring require the profound depth and efficiencies of AI and automated processes to ensure protection of cloud-resident sensitive data. As demands around data change in 2024, organizations will need to invest in their security and cloud ops teams, approaching data security like a team sport, building more efficient shared responsibility models to better protect data. These teams can then regain visibility of all data stores within an enterprise’s cloud or on-premises environment and trace possible attack paths, overprovisioned access, and risks that can lead to data exposure. Only by identifying the approach to data, ensuring permissions and privileges and efficiently implementing AI will companies enable their teams to be successful in 2024.” 

Raul Martynek, CEO at DataBank

The AI cloud wars between hyperscalers will take center stage

“With Google’s latest investment in Anthropic, together with Microsoft’s stake in OpenAI as well as Nvidia’s support for GPU-as-a-service players like CoreWeave, we are beginning to see the emerging outlines of a new phase of competition in the public cloud driven by differentiated AI GPU clouds. In 2024, these new competition dynamics will take center stage as big tech seeks to outcompete each other in the race to realize artificial general intelligence. Nvidia will emerge as a giant competing on the same level as the ranks of Google, Microsoft and AWS. With its cutting-edge GPUs, I see Nvidia emerging as a very capable threat to big tech’s dominance in the public cloud space.”

Miroslav Klivansky, Global Practice Leader at Pure Storage

“In 2024, we will start to creep into GenAI’s trough of disillusionment (Gartner’s Hype Cycle defines this as the period when interest wanes as experiments and implementations fail to deliver) and will eventually industrialize the use of AI. As we shift away from the hype brought on by AI tools with a consumer-friendly UX, we’ll see companies better understand, invest in, and apply AI-specific solutions to their business needs.”

“In 2024, we can expect AI to optimize energy efficiency across energy-hungry industries (e.g., manufacturing) as it will be integral in optimizing the process and money savings. Deploying LLMs for inference at scale will also lead to surprisingly high power bills, leading companies to review their data center efficiency strategies and ESG initiatives.”

“One of the industries most ripe for innovation with the help of AI is healthcare. Not only does AI have the potential to improve diagnostics, but it can also improve medical devices and automate administrative tasks. The latter will likely be disrupted first because those systems are already managed electronically and their tasks can be automated quickly.”

The rate of AI innovation will slow down in the next year

“Over the last several years, AI innovation has been fueled by information sharing and open-source development. However, as companies increasingly invest in AI to give them a competitive edge and regulatory bodies seek to unpack the potential around AI’s broader impact, companies will likely be more aggressive when it comes to protecting their IP.”

Kurt Markley, Managing Director, Americas at Apricorn

Cyber Resilience

“The rapid growth of AI is helping bad actors more quickly create and deploy ransomware tools across a host of industries. It’s been reported that generative AI has helped to double ransomware attacks against industries such as healthcare, municipalities and education between August 2022 and July 2023. Also concerning is the rate at which organizations choose to pay a ransom in order to secure their data. One research report shows that nearly half of respondents have a security policy in place to pay a ransom, with 45% admitting that bad actors still exposed their data even after paying the ransom.

Ransomware isn’t just a threat; in many instances it’s an inevitability. No data is too low-value and no organization is too small. The alarmingly high rate of paying a ransom and still having data exposed means that IT leaders have to take back control and put practices in place to protect their data and their capital budget. IT leaders can’t afford to slack off on cyber resilience.

While almost all IT leaders say they factor data backups into their cyber security strategies, research we conducted earlier this year found that only one in four follow the best practice known as the 3-2-1 rule: keep three copies of data on two different media, one of which is stored offsite and encrypted. Furthermore, the same research found that more than half of respondents kept their backups for 120 days or less, far shorter than the average 287 days it takes to detect a breach.

Given the likelihood that AI-driven ransomware will impact far higher numbers of organizations, it will be more important than ever in 2024 that organizations have a strong cyber resiliency plan in place that relies on two things: encrypting data and storing it for an appropriate amount of time. IT leaders need to embrace the 3-2-1 rule and must encrypt their own data before bad actors steal it and encrypt it against them.”
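
To make the 3-2-1 guidance above concrete, here is a minimal Python sketch that encrypts a backup archive before copying it to a second medium and an offsite location. The file paths and key handling are illustrative assumptions only; a production setup would add key management, integrity verification, and retention enforcement.

    # Minimal 3-2-1 sketch (illustrative only): one primary copy, one copy on a
    # second medium, one encrypted copy held offsite. Paths are hypothetical and
    # the 'cryptography' package is assumed to be installed.
    from pathlib import Path
    import shutil
    from cryptography.fernet import Fernet

    PRIMARY = Path("/data/backups/app-2024-01-15.tar.gz")         # copy 1: primary
    SECOND_MEDIUM = Path("/mnt/usb_vault/app-2024-01-15.tar.gz")  # copy 2: second medium
    OFFSITE = Path("/mnt/offsite_sync/app-2024-01-15.tar.gz.enc") # copy 3: offsite, encrypted

    def encrypt_file(src: Path, dst: Path, key: bytes) -> None:
        # Fernet provides authenticated symmetric encryption (AES-CBC plus HMAC).
        dst.write_bytes(Fernet(key).encrypt(src.read_bytes()))

    def three_two_one(key: bytes) -> None:
        SECOND_MEDIUM.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(PRIMARY, SECOND_MEDIUM)       # second medium, plain copy
        OFFSITE.parent.mkdir(parents=True, exist_ok=True)
        encrypt_file(PRIMARY, OFFSITE, key)        # offsite copy is encrypted first

    if __name__ == "__main__":
        # In practice the key would come from a KMS or HSM, never be generated inline.
        three_two_one(Fernet.generate_key())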

Data Management Within Security Policy

“Data is no longer a byproduct of what an organization’s users create; it is the most valuable asset organizations have. Businesses, agencies and organizations have invested billions of dollars over the past decade to move their data assets to the cloud; the demand is so high that Gartner expects that public-cloud end user spending will reach $600B this year. These organizations made the move to the cloud, at least in part, because of a perception that the cloud was more secure than traditional on-prem options.

It’s estimated that 30 percent of cloud data assets contain sensitive information. All that data makes the cloud a juicy target and we expect that 2024 will continue to show that bad actors are cunning, clever and hard-working when it comes to pursuing data. The industry has seen triple the number of hacking groups attacking the cloud, with high-profile successes against VMware servers and the U.S. Pentagon taking place this year.

As IT teams spend more on moving and storing data in the cloud, organizations must spend the next 12 to 24 months auditing, categorizing and storing it accordingly. They need to gain deeper visibility into what data they have stored in the cloud, how data sets relate to each other, and whether the data is still meaningful to the operations of the organization. In doing so, they are advised to create specific security policies about how, where and for how long they store their data. These policies, when actively enforced, will help organizations better protect their most valuable asset – their data.”

Brian Land, VP of Global Sales Engineering at Lucidworks

Navigating the Privacy Terrain

“In 2024, brands are gearing up to face new challenges around privacy and ethics with the end of third-party cookies and the advent of new large language models (LLMs). This means they’ll be shaking things up in how they market and handle consumer data privacy. For example, they’ll have to find new methods for collecting user data and be more transparent about how they’re collecting that data. And when it comes to managing LLMs, they will adopt advanced encryption and secure data storage practices to safeguard user information. Rest assured, they’re working hard to get it right – making sure they follow the rules while still keeping consumers engaged and happy.”

Matt Watts, CTO at NetApp

There’s No Such Thing as Perfection

“If your company thinks the cloud will ease every IT woe your team is experiencing, you’ll want to think again. It won’t and it can’t. Migrations in hybrid multicloud environments strain both budgets and team resources and you’ll need to find ways to optimize operations both as you move to the cloud and every day thereafter. According to a recent report on data complexity, approximately 75 percent of global tech executives in the throes of cloud migration note they still have a sizable number of workloads remaining on-premises (between 30 percent and 80 percent). For most companies, maintaining an increasingly complex IT infrastructure will remain challenging as cost pressures mount alongside demands for greater innovation. In 2024, we’ll see companies abandon unrealistic ideas of creating the “perfect” cloud environment as they move toward an intelligent data infrastructure (IDI). IDI is the union of unified data storage with fully integrated data management that delivers both security and observability with a single pane of glass interface so you can store, control, and use data more easily no matter what applications, cloud services, or databases you’re using. Companies that choose IDI will experience greater agility in adapting to market conditions. With a more nimble infrastructure, IT can spend its time on innovation, skill building, and development that align with business priorities rather than simply maintaining their cloud environments.”

The Uncomfortable Truth: You’ve Already Had a Breach

“Today’s cyber threat landscape requires constant vigilance as you try to guess who, when, and how the next bad actor will attack. Whether an employee clicks on the wrong link or an organized gang of cyber criminals is the culprit, you’ll need the right tools to quickly alert you to an attack so you can recover quickly. And, while preventing attacks is always the goal, the ability to keep bad actors out indefinitely is now a statistical anomaly. In fact, it’s predicted that by 2031 ransomware attacks will occur every 2 seconds at a cost of $265 billion each year. Because of this, 87 percent of C-suite and board-level executives see protecting themselves from ransomware as a high, or top, priority. And stolen data isn’t your biggest concern after an attack. It’s the lost productivity and disrupted business continuity as systems are repaired and data is restored to get your business up and running again. In 2024, we’ll see more investment in IT security that ensures systems are secure by design and keeps business disruption to a minimum when there is an attack. Security infrastructures that include immutable data backups will add to peace of mind and mitigate downtime when cyber incidents are investigated.”

James Beecham, Founder and CEO at ALTR

While AI and LLMs continue to increase in popularity, so will the potential danger

“With the rapid rise of AI and LLMs in 2023, the business landscape has undergone a profound transformation, marked by innovation and efficiency. But this quick ascent has also given rise to concerns about the utilization and the safeguarding of sensitive data. Unfortunately, early indications reveal that the data security problem will only intensify next year. When prompted effectively, LLMs are adept at extracting valuable insight from training data, but this poses a unique set of challenges that require modern technical solutions. As the use of AI and LLMs continues to grow in 2024, it will be essential to balance the potential benefits with the need to mitigate risks and ensure responsible use. 

Without stringent data protection over the data that AI has access to, there is a heightened risk of data breaches that can result in financial losses, regulatory fines, and severe damage to the organization’s reputation. There is also a dangerous risk of insider threats within organizations, where trusted personnel can exploit AI and LLM tools for unauthorized data sharing whether it was done maliciously or not, potentially resulting in intellectual property theft, corporate espionage, and damage to an organization’s reputation.  

In the coming year, organizations will combat these challenges by implementing comprehensive data governance frameworks, including data classification, access controls, anonymization, frequent audits and monitoring, regulatory compliance, and consistent employee training. Also, SaaS-based data governance and data security solutions will play a critical role in keeping data protected, as they enable organizations to fit these tools into their existing frameworks without roadblocks.”

With increased data sharing, comes increased risk

“Two things will drive an increased need for governance and security in 2024. First, the need to share sensitive data outside of traditional on-premises systems means that businesses need increased real-time auditing and protection. It’s no surprise that sharing sensitive data outside the traditional four walls creates additional risks that need to be mitigated, so next year, businesses need – and want – to ensure that they have the right governance policies in place to protect it.

The other issue is that new data sets are starting to move to the cloud and need to be shared. The cloud is an increasingly popular platform for this, as it provides a highly scalable and cost-effective way to store and share data. However, as data moves to the cloud, businesses need to ensure that they have the right security policies in place to protect data, and that these policies are being followed. This includes ensuring that data is encrypted both at rest and in transit, and that the right access controls are in place to ensure that only authorized users can access the data. 

In 2024, to reduce these security risks, businesses will make even more of an effort to protect their data no matter where it resides.”
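
One hedged illustration of the controls described above – encryption at rest, encryption in transit, and scoped access – is sketched below using boto3: the object is uploaded with server-side KMS encryption over HTTPS and then shared through a short-lived presigned URL rather than broad bucket access. The bucket name, object key, and KMS key alias are assumptions for the example.

    # Illustrative sketch only: encrypt at rest with SSE-KMS, rely on HTTPS in
    # transit, and share via a time-limited presigned URL instead of opening the
    # bucket. Bucket name, object key, and KMS key alias are hypothetical.
    import boto3

    s3 = boto3.client("s3")  # boto3 talks to S3 over HTTPS by default (in transit)
    BUCKET = "example-shared-data"   # hypothetical bucket
    KEY = "exports/q4-report.csv"    # hypothetical object key

    def upload_encrypted(local_path: str) -> None:
        with open(local_path, "rb") as f:
            s3.put_object(
                Bucket=BUCKET,
                Key=KEY,
                Body=f,
                ServerSideEncryption="aws:kms",        # encrypted at rest with KMS
                SSEKMSKeyId="alias/example-data-key",  # hypothetical key alias
            )

    def share_link(expires_seconds: int = 3600) -> str:
        # Time-limited, object-scoped access instead of opening the bucket broadly.
        return s3.generate_presigned_url(
            "get_object",
            Params={"Bucket": BUCKET, "Key": KEY},
            ExpiresIn=expires_seconds,
        )

    if __name__ == "__main__":
        upload_encrypted("q4-report.csv")
        print(share_link())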

Rodman Ramezanian, Global Cloud Threat Lead at Skyhigh Security

Data Security and Privacy Concerns

“Organizations are increasingly concerned about the security and privacy of their data in the cloud. On-premises infrastructures tend to give organizations more control over their data.”

Andrew Hollister, CISO & VP Labs R&D at LogRhythm

Generative AI adoption will lead to major confidential data risks

“The cybersecurity landscape will confront a similar challenge with generative AI as it did previously with cloud computing. Just as there was initially a lack of understanding regarding the shared responsibility model associated with cloud computing, we find ourselves in a situation where gen AI adoption lacks clarity. Many are uncertain about how to effectively leverage gen AI, where its true value lies, and when and where it should not be employed. This predicament is likely to result in a significant risk of confidential information breaches through gen AI platforms.”

Angel Vina, CEO & Founder at Denodo

Organizations Will Need to Manage Cloud Costs More Effectively

“As businesses continue to shift data operations to the cloud, they face a significant hurdle: the relentless, unsustainable escalation of cloud data expenses. For the year ahead, the mandate is not just to rein in these rising costs but to do so while maintaining high-quality service and competitive performance. Surging cloud hosting and data management costs are preventing companies from effectively forecasting and budgeting, and the previously reliable costs of on-premises data storage have become overshadowed by the volatile pricing structures of the cloud.

Addressing this financial strain requires businesses to thoroughly analyze cloud expenses and seek efficiencies without sacrificing performance. This involves a detailed examination of data usage patterns, pinpointing areas of inefficiency, and considering more cost-effective storage options. To manage cloud data costs effectively, firms need to focus on the compute consumed by queries and the associated data egress volumes, tabulate the usage of datasets, and optimize storage solutions. These efforts are enhanced by adopting financial operations (FinOps) principles, which blend financial accountability with the cloud’s flexible spending model.

By regularly monitoring expenditures, forecasting costs, and implementing financial best practices in cloud management, organizations can balance cost savings and operational efficacy, ensuring that their data strategies are economically and functionally robust. In 2024, we will see a significant rise in the use of FinOps dashboards to better manage cloud data charges.”
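
The FinOps practice described above – tabulating query compute and egress per dataset – can start with something as simple as the sketch below, which aggregates a hypothetical usage log and surfaces the costliest datasets. The log fields and unit prices are assumptions, not any provider’s actual schema or rates.

    # Minimal FinOps-style rollup (illustrative only): aggregate query compute and
    # egress per dataset from a usage log and flag the costliest ones.
    # Field names and per-unit prices are assumptions, not any vendor's schema.
    import csv
    from collections import defaultdict

    COMPUTE_PRICE_PER_SEC = 0.0001   # hypothetical dollars per compute-second
    EGRESS_PRICE_PER_GB = 0.09       # hypothetical dollars per GB of egress

    def rollup(usage_csv: str, top_n: int = 5):
        costs = defaultdict(float)
        with open(usage_csv, newline="") as f:
            for row in csv.DictReader(f):  # expects: dataset, compute_seconds, egress_gb
                costs[row["dataset"]] += (
                    float(row["compute_seconds"]) * COMPUTE_PRICE_PER_SEC
                    + float(row["egress_gb"]) * EGRESS_PRICE_PER_GB
                )
        return sorted(costs.items(), key=lambda kv: kv[1], reverse=True)[:top_n]

    if __name__ == "__main__":
        for dataset, cost in rollup("cloud_usage.csv"):
            print(f"{dataset}: ${cost:,.2f} estimated this period")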

Kevin Keaton, CIO/CISO at Red Cell Partners

Shifting Cyber Regulations Will Change the Status Quo

“The new SEC cyber rules that go into effect in December for public companies are causing, and will continue to cause, significant changes in how boards operate with regard to cyber risk – and I expect that navigating these rules will be the biggest cybersecurity challenge businesses face in 2024.”

Eric Herzog, Chief Marketing Officer at Infinidat

Triple play of cyber resiliency, detection, and recovery to create an overall corporate cybersecurity strategy 

“The convergence of cyber resilience, detection, and recovery on a single storage platform is fueling a trend for 2024 around higher levels of cybersecurity for enterprise storage. Reliance solely on backup is no longer enough to secure storage systems. Primary storage has become a main target of cybercriminals for the most insidious and hard-to-detect ransomware and malware attacks that wreak costly havoc on enterprises. Combining resilience (the ability to instill defensive security measures to repel attacks), detection (the ability to know when data is corrupted and whether a known good copy of data is free of ransomware or malware), and recovery (the ability to bounce back) from cyberattacks is the key to hardening storage infrastructure.

This trend to better secure storage systems is highly important because of the continued exponential increase in cyberattacks against enterprises of all types and in all industries.  Cybercrime is predicted to grow from $8 trillion worldwide in 2023 to more than $10 trillion in 2025. Cybercriminals attempted nearly 500 million ransomware attacks last year, marking the second-highest year ever recorded for ransomware attacks globally, and in the 2023 Fortune CEO survey of “Threats” to their companies, CEOs named cybersecurity their #2 concern. Ransomware attacks also represented 12 percent of breaches of critical infrastructure in the last year.

The convergence of cyber resilience, detection, and recovery on an integrated storage platform is an advancement over the past, commonly-used approach of disparate tools and technologies trying to combat cyberattacks in silos. Improving the security defenses of cyber storage for enterprises eliminates the vulnerabilities of the silos. It makes the cyber capabilities more air-tight and ensures a rapid recovery of data within minutes to thwart cybercriminals, nullifying ransom demands and preventing (or minimizing) any downtime or damage to the business. Ignoring this trend in 2024 could greatly harm an enterprise, especially one that doesn’t even know cybercriminals are lurking in their data infrastructure, no matter how good their other cybersecurity defenses are.”

Stacy Hayes, Co-Founder and EVP at Assured Data Protection

More channel players to use specialists for managed services 

“The managed services model will become increasingly attractive to traditional VARs in 2024, especially with more and more businesses looking to buy cloud and IT services on a usage basis. But making the transition from a traditional VAR to a provider of managed services is easier said than done. It’s not that VARs aren’t capable of diversifying – far from it – it’s just that the switch requires a fundamental shift in the way VARs do business, and that isn’t something you can change overnight. These large organizations are not built for this new world model. The in-house build and integration of new technology and go-to-market models takes too long and is too expensive to implement. VARs simply don’t have the people, the flexibility or the know-how. With the economic headwinds as they are, OPEX is king and no one has the CAPEX or the appetite for big in-house builds.

It is becoming increasingly difficult for VARs to provide a large portfolio of products and services to the standards customers demand. The speed at which the market moves and the reliance on data all add to greater demands from customers. It is evident that channel businesses are struggling to deliver what their customers want, whether on-premises or in the cloud. It is a common topic, and one I believe means VARs need to clearly understand what they can deliver themselves and what they need to outsource. Outsourcing, or white labelling, is a great way to deliver a high-quality and diverse portfolio to customers.

MSPs that have the know-how to use utility-based models effectively, that can execute immediately, that have experts in the space, and that deliver services tailored for the vendor, customer, and end user will be the partners of choice for VARs in 2024.”

Brian Dutton, Director, US Sales and Client Services at Assured Data Protection

More businesses to spend upfront for managed services to beat inflation 

“Businesses are becoming more cost-conscious as prices for cloud and SaaS services keep rising in line with inflation. Every time the large vendors and hyperscalers pass on costs to the customer, company CFOs and finance directors find themselves asking IT the question, ‘where can we cut costs?’ This is creating a dilemma for IT teams, who are left wondering how they can keep the lights on and execute new digital and cloud strategies on a smaller budget. That is why so many have switched to an OPEX model that covers core capabilities, including DR and backup, based on a consumption model that is paid for in monthly installments. It has allowed them to cut CAPEX, operate on a per-TB model rather than wasting valuable data center resources, and focus their efforts on business priorities.

The impact and cost savings are tangible, but it’s also thrown a lifeline to SMBs and government organizations that simply don’t have the budget or infrastructure to support investment in new DR and backup solutions. The managed service option has become the preferred choice for large enterprises that have to prioritize transformation projects, and for SMEs, local schools and municipalities with budget limitations. We expect more businesses to adopt the utility-based model that managed service providers offer for cloud-based data management. It lightens the load on teams, while reducing risk and guaranteeing uptime and business continuity in the event of a disaster, data breach or ransomware attack. Another byproduct of this trend we’ve experienced is that companies are prepared to pay for services upfront, locking in costs for 6-12 months or longer to protect themselves against inflation. This makes financial sense, especially if you’re cash-rich now and want to ensure your data is protected over the long term, when market volatility can affect prices elsewhere. We expect this to become the norm next year and for the foreseeable future.”

Andrew Eva, Director, CIO at Assured Data Protection

Scope three emissions compliance set to drive uptake of disaster recovery managed services 

“Sustainability is an issue that impacts every part of the economy and, increasingly, the technology sector is being held to account for its carbon emissions. Until recently, organizations have mostly had to concern themselves with two key emission classifications: scope one – emissions the organization is directly responsible for – and scope two – indirect emissions, such as electricity. Now, though, we’re seeing the impact of scope three emissions being felt: all other emissions associated with an organization’s activities, including its supply chain. While scope three reporting isn’t yet legally enforceable, it is being widely adopted by large organizations, as legislation is inevitable and there’s a widespread desire to get ahead of the issue. We’re now seeing its impact filter down to smaller organizations.

This is an issue for the DR sector, and organizations that are leaders in sustainability are recognizing the challenge and the value of outsourcing this function to an MSP. By eliminating the need for data backup via a second site – sites that are costly to operate, don’t always utilize the latest power-efficient hardware, and are responsible for significant carbon emissions – ESG compliance becomes a lot more manageable. There’s also recognition that this isn’t simply offloading the problem, because MSP DR solutions achieve economies of scale by servicing multiple organizations via a shared facility, making them carbon-efficient for customers. Given the rate at which scope three is permeating, we expect to see more organizations adopt outsourced DR services. Both existing and future business for MSPs depends on helping customers and partners achieve ESG compliance.”

Eric Herzog, Chief Marketing Officer at Infinidat

Freeing up money from storage for AI and other critical IT projects 

“Dramatically reducing the costs of enterprise storage frees up IT budget to fund other significant new projects, such as AI projects, cybersecurity-related projects, or other strategic activities. This trend for 2024 will play a pivotal role in enterprises where there will be pressure to accelerate AI projects for the next stage of digital transformation, as well as to improve cybersecurity against more sophisticated AI-driven cyberattacks. With IT budgets projected by Gartner to grow by 8 percent in 2024, funding for these new projects will need to come from somewhere.

A smart approach to shifting IT spending internally that is taking hold is to remove costs from storage, while simultaneously improving storage. It sounds like a paradox at first sight, but the trend in 2024 will be to take advantage of three key factors that make this “paradox” an actual reality: (1) storage consolidation onto a single, scalable high availability and high performance platform, (2) autonomous automation, and (3) pay-as-you-go, flexible consumption models for hybrid cloud (private cloud and public cloud) implementations of storage. 

Storage consolidation, for example, replaces 20 storage arrays with one or two storage arrays that can scale into the multi-petabyte range with 100% availability guaranteed. Having fewer arrays immediately lowers costs in terms of IT resource and storage management, power, cooling, and space. This cost savings can be used for critical IT projects. Autonomous automation simplifies storage, intelligently automating processes and how to handle applications and workloads. Virtually no human intervention is needed. The storage systems run themselves, enabling a set-it-and-forget-it mode of operations. IT staff can focus on more value-adding activities and building AI capabilities into the infrastructure and across the enterprise. Leveraging flexible consumption models to pay for storage, as needed and as efficiently as possible, lowers CAPEX and OPEX, freeing up money for these other IT projects. An extension of this trend is also to invest in enterprise storage solutions that deliver an ROI in one year or less, optimizing the budget.”

The blossoming of green storage 

“Having more storage capacity in the same form factor and having fewer storage arrays have become part of an ongoing trend to make enterprise storage, along with data centers in general, more environmentally friendly. Fewer arrays mean a smaller carbon footprint, less equipment to cool, and less to recycle, translating into a much lower impact on the environment. Furthermore, consolidating multiple arrays into a single storage platform means the use of less energy, fitting a strategy to advance sustainability. Storage upgrades also bring energy savings, which translate into cost savings in light of the rising costs of energy this year. Consolidation also means much improved, more efficient utilization of space.

With the higher cost of energy, the need to reduce floorspace costs, the drive to lower carbon emissions, and the desire to reduce the environmental impact of recycling storage arrays, the new year will see an increase in audits of enterprise storage infrastructure and more intense identification of inefficiencies and waste in the storage estate.

The blossoming of green storage will be demonstrated in 2024 by reduced energy consumed to power storage systems, while still protecting data. We’ll see bigger capacity systems being installed that take up less space than traditional arrays. Software-defined storage increases storage utilization and reduces overprovisioning of storage. Also, part of green storage will be managing data as part of a data lifecycle strategy for more agility and better compliance with sustainability standards.

As part of broader green IT initiatives, storage will come under greater scrutiny in 2024 to boost efficiency and conservation. Enterprises will increasingly turn to AI for the capabilities to optimize storage capacity and streamline management, resulting in more efficiency. Gartner predicts that by 2025 half of all data centers will deploy AI/ML to increase efficiency by up to 30%. AI will also be used to optimize cooling. More enterprises that use water-based cooling systems to cool their data centers where storage systems reside will be compelled to get on a path to become “water positive,” replenishing more water than they consume. Green IT is changing the way storage administrators need to think about the future of enterprise storage.”

Seamless hybrid cloud integration

“The shift to hybrid cloud in the enterprise has been ongoing for years, as enterprises figured out that a balanced approach to leveraging the public cloud and maximizing on-premises private cloud data infrastructure makes the most business sense. But what is new about this trend is how dominant the hybrid cloud has become for enterprise storage. New capabilities have made it extremely easy for enterprises to manage on-prem private cloud storage and public cloud storage as one integrated, software-defined infrastructure, as if the public cloud were just another “array” identified in the software’s user interface. Advancements over the past year that make managing hybrid cloud storage even simpler have unlocked the full-scale embrace of this approach to storage, especially for large enterprises.

Hybrid cloud has also become easier to scale. If a large enterprise needs to expand capacity quickly because of an unexpected burst in data traffic, it can scale fast on a single, software-defined, multi-petabyte storage system that is on-prem. It can obtain a cloud-like experience while leveraging its own infrastructure, without any downtime or complexity. At the heart of why hybrid cloud is so strong, and so attractive, are cost and control. Enterprises can keep costs lower by using an on-prem storage system after storage consolidation of many arrays into one or two systems. They also don’t have to pay the hidden costs of moving data back and forth from the public cloud. They can use the public cloud for the use cases that are most appropriate, such as archived data, backup data, disaster recovery, or DevOps. Simultaneously, enterprises keep better control of their data by having it on-prem, ensuring compliance, especially with recent regulations about data governance and data privacy.”

Opening up the potential of containers in the hybrid cloud data center

“It’s estimated that approximately 90 percent of global organizations will be running containerized applications in production by the year 2026, according to Gartner analysts – up from 40 percent just two years ago. By 2026, it’s estimated that as much as 20% of all enterprise applications will run in containers, more than doubling the percentage from 2020. The adoption and expanded usage of containers are definitely on the upswing. This multi-year trend is gaining momentum for 2024 because of the increasing need for enterprises to innovate at a faster rate today than ever to meet evolving customer expectations. The point is that enterprises are becoming more digitally-focused in order to better compete.

Containers, which exemplify a cloud-native approach, provide a cost-efficient way to automate the deployment of modern applications – and do it at scale – while making them portable, environment-agnostic, and less resource-dependent to achieve cost savings. Consequently, the rate at which new applications are being developed is monumental. IDC reported that by the end of 2023, roughly 500 million new “logical applications” will have been created – a number equivalent to the total number of applications developed over the past four decades.

According to the latest available Cloud Native Computing Foundation (CNCF) survey, 44% of respondents are already using containers for nearly all applications and business segments and another 35 percent say containers are used for at least a few production applications.

More attention is being paid to thinking about the infrastructure – particularly the enterprise storage infrastructure – that supports containers. Storage is critical in the world of containers. The challenge with this trend is to pick the right enterprise storage, especially with enterprises globally needing to scale container environments into petabytes.

CSI (the Container Storage Interface) is the standard for external primary storage and backup storage in container deployments, and it is becoming the default for Kubernetes storage environments and other container orchestrators. Kubernetes has emerged as the de facto standard for container orchestration, and applications, workloads, and environments are transforming around it. It’s in your favor to work with an enterprise storage solutions provider that aligns to the CSI standard, because the container world moves quickly and the versions of Kubernetes and its associated distributions get updated every three to six months.”
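
To ground the CSI point above, the hedged sketch below uses the official Kubernetes Python client to request a volume through a CSI-backed StorageClass, so the workload never needs to know which array sits behind it. The StorageClass name and namespace are assumptions, and the cluster is assumed to already have a CSI driver installed.

    # Illustrative sketch only: request storage through a CSI-backed StorageClass
    # using the official 'kubernetes' Python client. The StorageClass name
    # ("enterprise-csi") and the namespace are hypothetical.
    from kubernetes import client, config

    def create_pvc(name: str, size: str = "100Gi", namespace: str = "default"):
        config.load_kube_config()  # or config.load_incluster_config() inside a pod
        pvc = client.V1PersistentVolumeClaim(
            metadata=client.V1ObjectMeta(name=name),
            spec=client.V1PersistentVolumeClaimSpec(
                access_modes=["ReadWriteOnce"],
                storage_class_name="enterprise-csi",  # provisioned by the CSI driver
                resources=client.V1ResourceRequirements(requests={"storage": size}),
            ),
        )
        return client.CoreV1Api().create_namespaced_persistent_volume_claim(
            namespace=namespace, body=pvc
        )

    if __name__ == "__main__":
        create_pvc("analytics-data")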

Skills gap in storage calls for an increase in automated storage

“A skills gap across the data center and, particularly, in enterprise storage has emerged, and the trend is that this gap is getting wider. Fewer IT professionals are choosing to specialize in storage, yet enterprise capacity continues to grow exponentially. While this makes storage administrators more valuable at the moment, an increasing number of enterprises are having trouble staffing the roles needed to manage traditional, legacy storage in support of applications, workloads and the entire data infrastructure. The shortage of qualified IT professionals creates a precarious situation for the future of many enterprises. Therefore, the trend in the enterprise space is to turn to AI-equipped autonomous automation of enterprise storage.

With autonomous automation going mainstream in the enterprise space, CIOs and IT leaders can take a “set it and forget it” approach to storage. They can reduce the risk of the skills gap jeopardizing their ability to have an always-on, reliable storage infrastructure. This approach, of course, requires a very simple user interface for the average IT professional to be able to manage the storage, whether to increase capacity, see insights into the performance of the storage system, or execute a rapid recovery from a cyberattack.  At the same time, use of autonomously automated storage frees up valuable IT headcount to be utilized in other areas of the datacenter and enterprise software environments.

The increase in cyberattacks has exposed the skills gap even more, because enterprise storage has become the new frontier for pressure-testing the merging of cybersecurity and cyber resilience. If storage continues to be done in the traditional ways, enterprises will continue to be held to ransom by bad actors unleashing ransomware. However, by automating cyber detection with ML-tuned algorithms, the skills gap gets plugged with new capabilities, making the average IT person look like a “storage superstar.” The trend is toward self-directed, self-adjusting autonomous automation of enterprise storage to reduce the risks associated with any major skills gap.”

Redefining the user experience for enterprise storage

“For enterprise storage, user experience is no longer only about the graphical user interface (GUI). While the GUI is still important and should be as easy to use as possible, the scope of user experience has broadened to include elements essential in today’s market: guaranteed service level agreements (SLAs), white-glove service, and professional services. Enterprises are not only looking for a box on which to run their applications and workloads; they are increasingly looking for excellence in service and support to be built into their storage solution.

The trend is for enterprises to opt for the storage vendor that offers the best support, best SLAs, and best proactive professional services, yet at a low cost or for no charge. It redefines the expectations for user experience. Enterprises want to know they have an advocate within the storage provider who can lend their expertise to solve challenges for the customer. They want to know they can get L3 support directly, at a moment’s notice. They want to have confidence in the integration capabilities of a professional services team.

They see all of this as added value, and it has become either a dealmaker or a deal-breaker for many enterprises that are reevaluating their choice of storage solution vendor based on these criteria. User experience has been transformed into the total experience for the customer.”

Graham Russell, Market Intelligence Director at Own Company

More organizations will embrace the ongoing adoption of cloud and SaaS

“The SaaS revolution is turning industries into tech playgrounds: from healthcare to finance, the widespread embrace of Software as a Service applications has been a game-changer. But while SaaS may already seem ubiquitous, in reality many organizations have been slow to embrace it. 2024 is the year they will likely catch on. This shift will result in a broader market for SaaS providers, opening up new opportunities for growth and innovation. And with this increased adoption comes a greater need for data protection and cybersecurity measures. As organizations entrust mission-critical information to SaaS applications, the potential consequences of data loss and corruption become more significant.”

AI adoption will drive data breaches  

“As the adoption of AI continues to skyrocket, the risk of data breaches increases. The sophistication and reach of AI can inadvertently expose vulnerabilities in cybersecurity defences, making organizations more susceptible to malicious attacks and unauthorised access.

This inevitable intersection of AI and data breaches is set to redefine the data protection and cybersecurity landscape in the near future. The silver lining? It will propel a renewed and intensified focus on data security issues. With each headline-grabbing breach, businesses are becoming increasingly vigilant about the safety of their business data. Organizations will be more focused than ever on being compliant with – and demonstrating compliance with – regulatory standards.”

AI adoption will prompt greater focus on data hygiene

“As the adoption of AI continues its rapid ascent, the spotlight on data hygiene is poised to become even more intense. AI’s voracious appetite for high-quality, accurate data makes the concept of data cleanliness a critical factor in unleashing the true potential of AI applications.

In response to this need for impeccable data, a notable trend is the strategic use of backup files. Traditionally seen as a safety net for data recovery, backup files are now being leveraged as a valuable resource for training and refining AI and machine learning models. These files, enriched with historical and real-world data, serve as a goldmine for organizations looking to enhance the depth and breadth of their AI algorithms.

Incorporating backup files into AI and machine learning models allows organizations to simulate diverse scenarios, ensuring that the algorithms are robust and adaptable to real-world complexities. This approach not only optimises the performance of AI applications but also enhances the accuracy of predictions and decision-making processes.”
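
As one hedged illustration of the backup-as-training-data idea above, the sketch below reads records from an exported backup file (assumed here to be JSON Lines), applies a basic hygiene pass, and splits the result for model training. The field names and file path are assumptions, and real backup data would need governance review and de-identification before any such reuse.

    # Illustrative sketch only: turn an exported backup file (JSON Lines assumed)
    # into a cleaned training/validation split. Field names are hypothetical, and
    # sensitive fields should be removed or masked before reuse.
    import json
    import random

    SENSITIVE_FIELDS = {"email", "ssn", "phone"}  # hypothetical de-identification list

    def load_clean_records(path: str):
        records = []
        with open(path) as f:
            for line in f:
                rec = json.loads(line)
                if not rec.get("amount"):          # basic hygiene: drop incomplete rows
                    continue
                for field in SENSITIVE_FIELDS:     # strip direct identifiers
                    rec.pop(field, None)
                records.append(rec)
        return records

    def train_val_split(records, val_fraction=0.2, seed=42):
        random.Random(seed).shuffle(records)
        cut = int(len(records) * (1 - val_fraction))
        return records[:cut], records[cut:]

    if __name__ == "__main__":
        train, val = train_val_split(load_clean_records("backup_export.jsonl"))
        print(f"{len(train)} training records, {len(val)} validation records")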

Organizations will pivot to a ‘platform of choice’ at the core of their tech stacks

“In 2024, organizations will strategically opt for a ‘platform of choice’ that will serve as the center of their tech stack. This shift will help businesses move away from the fragmented approach of using multiple vendors and applications, towards a more streamlined and integrated tech ecosystem. As a result, organizations will go ‘all in’ with a platform and seek to use applications that are built on that platform to achieve greater efficiency and cost savings. This ‘platform of choice’ approach will go beyond a mere technological preference. As organizations prioritize applications built natively on the chosen platform, they hope to minimize the number of vendors and applications in their tech stacks, streamline their workflows, gain increased negotiating power, and potentially lower the costs associated with managing multiple solutions.”

Seth Batey, Data Protection Officer and Senior Managing Privacy Counsel at Fivetran

The pendulum for build versus buy is going to swing back to buy in 2024

“IT and data management will play a crucial part in bolstering ESG programs in 2024. While ESG and the criteria for each prong, including what investors or customers look for, may differ, strong retention policies support a company’s effort to satisfy the “E” prong, and strong privacy practices support the “S” prong. Having strong IT and data management, including robust data classification and inventory, is necessary to implement retention policies and other privacy safeguards that can be easy wins for an ESG program.”

Steve Stone, Head of Rubrik Zero Labs at Rubrik

The accelerating data explosion will force a security strategy rethink

“In 2024, organizations will face a stiffer challenge in securing data across a rapidly expanding and changing surface area. One way they can address it is to have the same visibility into SaaS and cloud data as they have in their on-premises environments – in particular with existing capabilities. And that will be a major cybersecurity focus for many organizations next year. More will recognize that the entire security construct has shifted – it’s no longer about protecting individual castles but rather an interconnected caravan.”

AI will dominate the cybersecurity conversation

“Both attackers and defenders will step up their use of AI. The bad guys will use it more to generate malware quickly, automate attacks, and strengthen the effectiveness of social engineering campaigns. The good guys will counter by incorporating machine learning algorithms, natural language processing, and other AI-based tools into their cybersecurity strategies.

I believe there is almost no scenario where AI-driven deepfakes won’t be part of the pending U.S. presidential election, among others. Even if the technology isn’t applied, it’s within the realm of possibility that AI deepfakes will be blamed for gaffes or embarrassing imagery.

We’ll also hear more about the role AI can play in solving the persistent cybersecurity talent gap, with AI-powered systems taking over more and more of the routine operations in security operations centers. 

When it comes to cybersecurity in 2024, AI will be everywhere.”

CISOs (and others) will feel pressure from recent government actions

“In late October, the Securities and Exchange Commission announced charges against SolarWinds Corporation, which was targeted by a Russian-backed hacking group in one of the worst cyber-espionage incidents in U.S. history in 2019, and its chief information security officer, Timothy G. Brown. The complaint alleged that for more than two years, SolarWinds and Brown defrauded investors by overstating SolarWinds’ cybersecurity practices and understating or failing to disclose known risks. 

The charges came nearly six months after a judge sentenced Joseph Sullivan, the former CISO at Uber, to three years of probation and ordered him to pay a $50,000 fine after a jury found him guilty of two felonies. Sullivan had been charged with covering up a ransomware attack while Uber was under investigation by the Federal Trade Commission for earlier lapses in data protection.

On top of all that, new SEC rules on cybersecurity and disclosure of breaches were set to take effect Dec. 15. They require public and private companies to comply with numerous incident reporting and governance disclosure requirements.

All of this will have CISOs looking over their shoulder in 2024. As if defending their organizations from bad actors wasn’t challenging enough, now they will have to pay more attention to documenting absolutely everything.”

Brian Spanswick, CISO and CIO at Cohesity

IT leaders will increase their focus on cyber resilience to prepare for 2024’s top security threats

“As attackers and their tools become more sophisticated in the age of generative AI, the need for organizations to be resilient and ensure they can limit business disruption during a cyber event will become even more critical. In turn, organizations will be further investing in cybersecurity fundamentals such as strong asset management practices, system patching, and data encryption.

Quickly recovering core business processes with aggressive recovery time and recovery point objectives significantly minimizes the disruption of a ransomware attack and reduces the leverage the attacker has when demanding payment. This is especially key as organizations must be prepared for the event when, not if, a cyberattack occurs.”
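
One way to make the recovery-objective point above measurable: the largest gap between successful backups is the effective recovery point objective an organization could actually meet. The hedged sketch below computes that gap from a list of backup completion times and compares it with a target; the timestamps and the four-hour target are illustrative assumptions.

    # Illustrative sketch only: compare the achieved recovery point objective (RPO),
    # i.e. the largest gap between successful backups, against a target.
    # Timestamps and the 4-hour target are assumptions for the example.
    from datetime import datetime, timedelta

    backup_completions = [
        datetime(2024, 1, 15, 0, 5),
        datetime(2024, 1, 15, 4, 10),
        datetime(2024, 1, 15, 9, 45),   # a delayed run widens the worst-case gap
        datetime(2024, 1, 15, 12, 0),
    ]
    TARGET_RPO = timedelta(hours=4)

    def achieved_rpo(times):
        times = sorted(times)
        return max(b - a for a, b in zip(times, times[1:]))

    if __name__ == "__main__":
        worst_gap = achieved_rpo(backup_completions)
        status = "meets" if worst_gap <= TARGET_RPO else "misses"
        print(f"Worst backup gap of {worst_gap} {status} the {TARGET_RPO} RPO target")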

As businesses move their workloads to the cloud, they will need to double down on data security by getting clear visibility of their attack surface – or risk misconfigurations and additional breaches

“Cloud transformation and hybridization are still in progress – and high-risk – for a number of organizations moving away from legacy systems. DSPM is a growing field to help customers manage this risk. In fact, Gartner has issued a report on this entitled “continuous threat exposure management.” Organizations need to fully understand the security implications of the new attack surface created by moving workloads from on-premises to the cloud.”

As more companies implement generative AI, they will face a challenge similar to shadow IT, except shadow AI puts proprietary data in the public domain, representing a much greater risk

“The challenge is not knowing what algorithms are being used, the data fueling them, and who is using those algorithms. CISOs and organizations will need to ensure transparency and control around the growing use of GenAI. Gartner has called this out in its focus on “AI trust, risk, and security management.”

In 2024, threat actors will continue demanding ransoms despite government agreements to not pay

“A group of nearly 50 countries recently pledged to no longer pay ransoms demanded as part of ransomware attacks. While this represents a diplomatic accomplishment, it won’t blunt the frequency or sophistication of attacks on government infrastructure. Attacks are so cheap and easy to launch, and the consequences so limited, that attackers will continue to probe for weaknesses in an automated, programmatic fashion. Further, nation-state-sponsored attackers will seek to sow chaos through attacks rather than pursue financial gain.”

Dale Zabriskie, CISSP CCSK, Field CISO at Cohesity

Generative AI and security will come together in the worldwide fight against cybercrime and advanced-persistent threats

“Attackers will be leveraging AI tools to entice employees via social engineering to click and act recklessly, exploit zero-day vulnerabilities, and much more at a much faster rate. We can expect both adversaries and innovative defenders to leverage AI, and it will be a force multiplier in both of their efforts.”

Autonomous and Stateless AI Agents will be effective and efficient as nations and corporations fight off these ever-growing and evolving threats

“The technology world is evolving at a very rapid pace, and with this, the skills gap in emerging technologies is growing much wider than ever before. New tools need to be developed to act as a translation engine between native/natural language and engineering-speak or technical jargon.

To solve this, we are already starting to see the emerging trends of AI Agents – systems that act and reason with a set of predefined tools – to solve more complex situations than traditional RAG architectures. Agent and tool combinations will be leveraged to assist humans in more complex systems management and operational automation.”

Greg Statton, Office of the CTO – Data and AI at Cohesity

The importance of domain-specific bodies of data that are clean, relevant, as well as a means for providing responsible and governed access to that data for use in LLM-based applications, will be paramount

“2023 was the year of the LLM – the bigger the better. Now that all the cloud vendors have selected their LLM provider of choice (or built their own), the world will turn to the importance of domain-specific bodies of data that are clean, relevant, as well as a means for providing responsible and governed access to that data for use in LLM-based applications.

Data will be classified into two main camps:

    • Dynamic Data – this is machine generated data that is served via API or via event streams (think message bus data from live systems). This data is useful when looking at the current state of a thing/object/system/etc

    • Static Data – This is data that lives for a while in a current state. This will mostly be documents that are generated by other knowledge workers. Data that has a shelf life of weeks/months/years.”

Dr. Darren Williams, CEO and Founder at BlackFog

After a record-breaking 2023, we expect that ransomware will not ease anytime soon. Fundamentally, ransomware is becoming the main threat to all organizations, and insurance is no longer a viable option. Action needs to be taken. In 2024, we predict several new trends will take hold:

“Ransomware gangs will look for new ways to force victims into paying. We have already seen gangs contact the SEC directly, reporting victims immediately to inflict maximum damage, forcing regulatory, reputational and class action liabilities. We expect this is just the beginning of several new tactics to maximize payouts.”

“Organizations will realize that their existing security is not making any impact on the new threat vectors and will finally start to focus on the core problem, “data security” and “data exfiltration.”

“More than 40 percent of existing data exfiltration goes to China and Russia. We expect other countries such as North Korea to play larger roles in 2024.”

“We expect to see major infrastructure applications become threat vectors for cyber gangs, similar to the way the MOVEit exploit was developed. Hiding in plain sight is going to be the new mantra for cyber gangs as they continue to avoid detection.”

“We expect to see ransomware disrupt major infrastructure through IoT devices and non-traditional platforms. These diverse systems often have limited security designed in and have significant exposure for organizations, particularly in the manufacturing industry.”

Monica Kumar, Chief Marketing Officer at Hitachi Vantara

Recognizing cloud as an operational model

“In 2024, we will see a significant shift in the perception of cloud computing. Gone are the days when all public cloud was assumed to be good; we will now be looking at cloud as an ecosystem. Cloud will no longer be a fixed location – either on-prem or in the public cloud – but an operating model that offers cloud principles like agility, self-service, cost-effectiveness, and scalability. This transformation from a location to an operational framework is becoming increasingly clear as more cloud providers begin to leverage solutions that bridge the gap between on-prem and cloud deployments.”

Hybrid cloud sustainability is no longer a luxury; it’s a necessity 

“Business leaders have shown a bigger commitment to reducing their environmental footprint in recent years, with a focus on optimizing resource usage and enhancing the efficiency of data centers. A 2023 study from Hitachi Vantara found that nearly four in five IT leaders and C-level business executives have developed plans for achieving net zero carbon emissions, with 60 percent saying the creation of eco-friendly data centers is a top priority. As businesses rely more on hybrid cloud solutions for their IT needs, these technologies must contribute to a sustainable future. Therefore, in 2024, hybrid cloud sustainability will transition from a “nice to have” strategy to an absolute necessity due to its real implications for the business.

The shift towards hybrid cloud sustainability will include a range of initiatives. Data center infrastructure and data management practices will be overhauled to reduce unnecessary resource consumption. This may consist of eliminating hot spots and excess energy usage, enhancing cooling systems, and properly disposing of electronic waste. Businesses will implement strategies to intelligently optimize workloads in their hybrid cloud setups for reduced energy consumption.

This transformation won’t just align with business goals; it will also drastically lower energy costs and streamline data management operations to improve efficiency, protect resources, and substantially curb environmental impact.”

Steve Santamaria, CEO at Folio Photonics

The Rise of Optical Storage in Active Archives

“In 2024, there will be a transformation in how we store and archive data with the emergence of optical storage as an alternative within active archiving systems. This trend will be driven by the growing demand for storage solutions that are not only long-lasting and secure but also energy-efficient. This new generation of optical storage will gain traction, especially in sectors where stringent data retention rules are in place, thanks to its durability and resistance to environmental wear and tear. By incorporating state-of-the-art optical storage into active archiving, we’re looking at a viable, environmentally conscious alternative to conventional storage methods, bolstering data access and security. This movement is a testament to the increasing emphasis on both data preservation and environmental stewardship.”

Steve Leeper, VP of Product Marketing at Datadobi

“As artificial intelligence (AI) continues to weave into the fabric of modern business, the year 2024 is likely to witness a surge in the demand for enhanced data insight and mobility. Companies will need to gain insight into their data to strategically feed AI and machine learning platforms, ensuring the most valuable and relevant information is utilized for analysis. This granular data insight will become a cornerstone for businesses as they navigate the complexities of AI integration. At the same time, the mobility of data will emerge as a critical factor, with the need to efficiently transfer large and numerous datasets to AI systems for in-depth analysis and model refinement. The era of AI adoption will not just be about possessing vast amounts of data but about unlocking its true value through meticulous selection and agile movement.

The trajectory of storage technology is also poised for a significant shift as the year 2024 approaches, with declining flash prices driving a broad-scale transition towards all-flash object storage systems. This shift is expected to result in superior system performance, catering adeptly to the voracious data appetites and rapid access demands of AI-driven operations. As flash storage becomes more financially accessible, its integration into object storage infrastructures is likely to become the norm, offering the swift performance that traditional HDD-based object storage lacks and the scalability that NAS systems lack. This evolution will be particularly beneficial for handling the large datasets integral to AI workloads, which necessitate rapid throughput and scalability. Consequently, a data mobility wave may be seen, with datasets and workloads being transferred from outdated and sluggish storage architectures to cutting-edge all-flash object storage solutions. Such a move is anticipated not just for its speed but for its ability to meet the expanding data and performance requisites of burgeoning AI initiatives.

Also importantly, in 2024, the landscape of data management will undergo a profound transformation as the relentless accumulation of data heightens the necessity for robust management solutions. According to Gartner’s projections, by 2027, it is expected that no less than 40% of organizations will have implemented data storage management solutions to classify, garner insights, and optimize their data assets, a significant leap from the 15% benchmark set in early 2023. This trend is likely to be propelled by the relentless expansion of data volumes, outpacing the rate at which companies can expand their IT workforce, thus elevating the indispensability of automation for data management at scale.

2024 is set to be a pivotal time for data management, with a shift towards API-centric architectures for meshed applications gaining traction. As customers increasingly demand that data management vendors offer API access to their functionalities, we are likely to see a mesh of interconnected applications seamlessly communicating with one another. Imagine ITSM (IT Service Management) and/or ITOM (IT Operations Management) software triggering actions in other applications via API calls in response to tickets — this interconnectedness will become commonplace. The trend towards API-first strategies will likely accelerate, driven by the desire to embed data management more integrally within the broader IT ecosystem. As a result, the development of self-service applications will flourish, enabling automated workflows and facilitating access to data management services without the need for manual oversight. This move towards a more integrated, automated IT environment is not just anticipated; it is imminent, reflecting a broader shift towards efficiency and interconnectivity within the technological landscape.

Finally, as we look toward 2024, we predict that an intensified focus on risk management will become a strategic imperative for companies worldwide.  Governance, risk, and compliance (GRC) practices are anticipated to receive heightened attention as companies grapple with the complexities of managing access to data, aging data, orphaned data, and illegal/unwanted data, recognizing these as potential vulnerabilities. Moreover, immutable object storage and offline archival storage will continue to be essential tools in addressing the diverse risk management and data lifecycle needs within the market.”
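The API-triggered workflow pattern predicted above (ITSM or ITOM software calling other applications’ APIs in response to tickets) can be sketched minimally as follows. This is an illustrative assumption, not any particular vendor’s API: the endpoint URL, ticket fields, and action name are all hypothetical placeholders.

```python
import requests

# Hypothetical endpoint of a data management platform that exposes an API
# for storage actions; replace with a real URL and auth in practice.
DATA_MGMT_API = "https://datamgmt.example.com/api/v1/actions"

def handle_itsm_ticket(ticket: dict) -> None:
    """Translate an ITSM ticket into a data management API call.

    For example, a 'storage-capacity' incident could trigger an automated
    archive job on the affected volume instead of waiting for manual work.
    """
    if ticket.get("category") != "storage-capacity":
        return  # only act on the ticket types we have chosen to automate

    payload = {
        "action": "archive-cold-data",
        "volume": ticket["ci"],          # affected configuration item
        "reason": f"ITSM ticket {ticket['id']}",
    }
    try:
        resp = requests.post(DATA_MGMT_API, json=payload, timeout=10)
        resp.raise_for_status()
        print(f"Triggered {payload['action']} for {payload['volume']}")
    except requests.RequestException as exc:
        print(f"API call failed; ticket {ticket['id']} left for manual review: {exc}")

if __name__ == "__main__":
    handle_itsm_ticket({"id": "INC0012345", "category": "storage-capacity", "ci": "vol-finance-01"})
```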

Rohit Badlaney, CGM, Cloud Product and Industry Platforms at IBM

Businesses Must Close Skills Gaps to Ensure Hybrid Cloud Success in 2024

“In the past few years, hybrid cloud adoption has accelerated at an exponential rate with no signs of slowing down heading into 2024. However, businesses will face persistent challenges along their hybrid cloud journeys due to the widening skills gap in the tech workforce. In fact, a 2023 global survey from IBM found that more than half of global decision-makers say cloud skills remain a challenge to widespread cloud adoption. In response to this challenge, 72 percent of organizations have created new positions to meet new and evolving demands for cloud skills. As organizations refine their multi- and hybrid cloud strategies, they must take a comprehensive approach that addresses skills gaps, creating opportunities to expand current workers’ skills and welcoming new skilled talent.”

 

Register for Insight Jam (free) to gain exclusive access to best practices resources, DEMO SLAM, leading enterprise tech experts, and more!

The post 87 Data Protection Predictions from 46 Experts for 2024 appeared first on Best Backup and Disaster Recovery Tools, Software, Solutions & Vendors.

]]>
33 Data Privacy Week Comments from Industry Experts in 2023 https://solutionsreview.com/security-information-event-management/data-privacy-week-comments-from-industry-experts-in-2023/ Wed, 25 Jan 2023 19:08:24 +0000 https://solutionsreview.com/backup-disaster-recovery/data-privacy-week-comments-from-industry-experts-in-2023/ For Data Privacy Week, the editors at Solutions Review have compiled a list of comments from some of the top leading industry experts. As part of Data Privacy Week (January 22-28) we called for the industry’s best and brightest to share their Identity Management, Endpoint Security, and Information Security comments. The experts featured represent some of the […]

The post 33 Data Privacy Week Comments from Industry Experts in 2023 appeared first on Best Backup and Disaster Recovery Tools, Software, Solutions & Vendors.

]]>
Data Privacy Week

For Data Privacy Week, the editors at Solutions Review have compiled a list of comments from some of the top leading industry experts.

As part of Data Privacy Week (January 22-28), we called for the industry’s best and brightest to share their Identity Management, Endpoint Security, and Information Security comments. The experts featured represent some of the top Cybersecurity solution providers with experience in these marketplaces, and each comment has been vetted for relevance and ability to add business value.



33 Data Privacy Week Comments from Experts


Chander Damodaran, CTO at Brillio

Digital adoption has taken a quantum leap, forever changing how organizations, industries, societies and people operate and behave. The pandemic accelerated the digitalization of customer interactions by several years, and there’s no turning back: we now live in an era of digital. With higher levels of digitalization, the volume of data has greatly expanded– which, in turn, has attracted new types of cybercrimes and attacks.

As more organizations become data-driven, where data rather than human intuition is used to make key business decisions, it’s imperative to embrace a “data culture” that hinges on data democratization to provide unfettered, enterprise-wide data access to everyone in the organization, while ensuring data privacy controls and security are fully baked into the organization’s data strategy.

Soumendra Mohanty, Chief Strategy Officer at Tredence

Companies today face a host of data privacy issues and challenges, including a proliferation of data that needs to be protected and the rising costs that must be incurred to do so, federal and local privacy regulations, the ongoing threat of cyberattacks and crime, new and advanced technologies that can be both helpful and challenging to employ, increased scrutiny, and more. By understanding these issues, companies can take steps to better protect their data and improve their data privacy practices. It is imperative to prepare for the future now. More regions and states continue to add specific privacy laws and regulations; in the next year, about 65 percent of the world’s population will be covered by modern privacy regulations, according to Gartner, so companies must review their business strategies when it comes to data protection and update them accordingly to prevent unpleasant surprises.

If you haven’t already, take an inventory of your data and ensure your policies are up to date regarding data storage and sharing. Communicate internally and externally so your stakeholders are aware of your data protection measures, and offer transparency around your practices. Companies can craft data privacy management policies and practices internally or engage an organization proficient in helping companies establish such practices. While no single checklist will suit every company, solid data privacy management incorporates access control, cybersecurity planning, device security, end-user training, ongoing updates, strong password policies, secure communications, data backup, and ongoing review with nimble adjustments as needed.

Shalabh Singhal, CEO at Trademo

Producing one reasonably complex product requires tens of thousands of parts, and most of these components are sourced from a vast geographical area and an extensive network of suppliers. On top of that, these suppliers themselves outsource their material to second-tier suppliers. This chain of activities results in an increasingly complex, geographically vast, and multi-tiered supply network. Technologies such as supply chain mapping will increasingly help in discovering dependencies beyond tier-1 suppliers, identifying and eliminating toxic and dangerous raw materials, and reducing the quantity and toxicity of emissions across the supply chain. Supply chain mapping will grow in importance in 2023 as it also helps in identifying concentration or compliance risk, allowing businesses to see early warning signals, predict potential disruptions, identify supply chain bottlenecks, take proactive measures to mitigate risks, and maintain competitiveness.

Angie Tay, Group Chief Operating Officer at TDCX

It’s a challenging time for customer experience leaders, with the availability of multiple channels of communication, change in customer expectations and economic headwinds all in play. Customer acquisition and retention are becoming increasingly important, and the right customer experience strategy will help companies achieve revenue growth and profitability.

In terms of innovation, finding appropriate ways to provide convenience to the end-user, along with effective ways to measure customer satisfaction, will be key. With digital marketplaces integrating product experience with apps, there should be extra emphasis on how to make that integrated experience seamless and personalized. Business leaders should also increase the capabilities required for digital trustworthiness, whether that means privacy protection or data security. Artificial intelligence and automation will play an important role in delivering better service and increased productivity, and these areas should be leveraged.

Aron Brand, CTO at CTERA

After many years of near-zero interest rates and workforce shortages, the tide has turned, and in 2022 we have seen the start of a dramatic shift in the business landscape. As we enter 2023, capital is going to be much more expensive, and interest rates will continue to rise, resulting in the increasing attractiveness of low-CAPEX, high-OPEX business models such as cloud computing and software-as-a-service. With the deflation of the bubble, many companies will be forced to downsize or even close their doors, but those that have made the shift to cloud-based models will be better positioned to weather the storm, due to the inherent elasticity and flexibility of the cloud.

Noam Shendar, Vice President of Business Development at Zadara

In 2022, the hyperscalers’ progress in edge computing initiatives was underwhelming, and it is leaving room in 2023 for upstarts to gain an edge. Despite the overall decrease in venture capital and private equity funding events, edge computing players will continue to see investment money pouring in over the course of 2023. There will be edge M&A activity as the technology matures and presents a credible alternative to hyperscale clouds.

Raveesh Dewan, CEO at Joget

Systems built in the last 20 years are coming to a point where their resource requirements are prohibitive. The risk of continuing to run those systems, combined with the need for business change, will push many of them over the edge, resulting in significant demand to rebuild them anew. Across government agencies and larger organizations around the globe, the starting point of service requests will move out of the hands of processors and into the consumer’s hands. The journey has already started with self-service systems, and it will continue for the next few years.

New systems will be a collection of smaller applications working harmoniously for better risk management and a clearer future outlook. Gone are the days when we implemented large ERP-like systems. In the next few years, moving fast will be a competitive advantage, and that will happen by building large systems out of smaller applications that each have their own independent life cycle, so that adding new ones to the ecosystem can happen seamlessly.

Ken Barth, CEO at Catalogic Software

Digital transformation initiatives will continue to be a top corporate priority for 2023 given the early results in improving operational quality, scalability, and lowering costs. DevOps and agile processes in support of digital initiatives are in turn driving the usage of containers, with Kubernetes as the de-facto container orchestration and management platform. As these dynamic applications based on Kubernetes move into production and generate business-critical data, the data generated by these workloads needs to be backed up for business continuity and compliance purposes.

In 2023, organizations will adopt a multi-cloud Kubernetes strategy for flexibility, security, and cost savings. Because public clouds like AWS, Azure, and Google Cloud are highly available and reliable, DevOps and IT Ops teams may believe that their data is safe and secure in the cloud and that they don’t need to do backups. Your data is always your responsibility, whether it is in a public cloud, in a SaaS application, or on-premises, especially for meeting regulatory and compliance policies. In today’s world of geopolitical disruptions, natural disasters, and security breaches, it would be foolish not to follow the same data protection best practices for cloud workloads that you follow for on-premises ones.

Alec Nuñez, Director of Business Compliance at Poll Everywhere

Data Privacy Day (January 28) commemorates the 1981 signing of Convention 108, the first international, legally binding treaty focused on privacy and data protection. This day is celebrated all over the world—and for very important reasons. The speed at which technology has, and will continue to, advance has inherently increased the importance of focusing on data privacy for any organization; protecting both company and customer data has always been a top priority, and while many companies continue to deploy new solutions to safeguard data, malicious actors still find new ways to access and steal sensitive data. Protecting this data is more important now than ever.

The number one issue when it comes to data privacy is the lack of education and guidance for an organization’s team. Human error has been and will continue to be the number one cause of data security issues; there is no competition. Companies can significantly minimize its impact by crafting best practices and creating training programs for the handling of data, with the intent that proper handling becomes second nature for all. The principle of least privilege is a substantial foundation all companies can establish when it comes to mitigating data security risks. This concept states that a user or entity should only have access to the data, resources, and applications required to execute a task. In other words, only provide individuals access to what they actually need. It is a basic idea to implement, but it will have a huge impact, permeating your organization’s systems.
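As a minimal illustration of the least-privilege principle described above, the sketch below grants access only when a role explicitly includes a permission. The roles, resources, and permission names are hypothetical examples, not a production authorization model.

```python
# A minimal least-privilege check: a role gets nothing it was not explicitly granted.
ROLE_PERMISSIONS = {
    "support-agent":   {"tickets:read", "tickets:write"},
    "billing-analyst": {"invoices:read"},
    "data-engineer":   {"warehouse:read", "pipelines:deploy"},
}

def is_allowed(role: str, permission: str) -> bool:
    """Grant access only if the role explicitly includes the permission."""
    return permission in ROLE_PERMISSIONS.get(role, set())

# A support agent can work tickets but cannot read the data warehouse.
assert is_allowed("support-agent", "tickets:write")
assert not is_allowed("support-agent", "warehouse:read")
```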

Almog Apirion, CEO and Co-Founder at Cyolo

Data Privacy Day aims to increase awareness of the need to protect employee and customer data while adhering to regulatory laws such as GDPR or CCPA. Even as newer regulations highlight today’s major need for data protection, this is not something new; in fact, the first legally binding international privacy and data protection treaty, Convention 108, was signed back in 1981, well before today’s regulations. Because of our greater reliance on digital technology to govern most facets of individual and organizational life, it is important to reconsider what data is shared, when and where it is shared, and with whom. Data Privacy Day is a component of the worldwide “STOP. THINK. CONNECT.” campaign for online privacy, security, and safety.

Strong data privacy is more critical than ever– particularly in response to the recent growth of cyberattacks and the expansion of data perimeters due to hybrid work. One way of mitigating today’s vulnerabilities is to provide rigorous identity-based access control. To safeguard themselves, enterprises’ collaboration and communications tools require a robust zero-trust framework to protect all forms of user data. Identity-based access control enables businesses to strengthen their security posture while also gaining visibility and control over their most critical systems. The reality is that hackers today don’t break in, they log in. Enterprises can get complete control and visibility of their entire IT infrastructure while mitigating against advanced threats by implementing a modern zero-trust solution and adopting stringent authentication requirements. As more risks emerge, organizations will be more prepared than ever to counter threats and safeguard data and business-critical infrastructure.

Carl D’Halluin, CTO at Datadobi

A staggering amount of unstructured data has been and continues to be created. In response, a variety of innovative new tools and techniques have been developed so that IT professionals can better get their arms around it. Savvy IT professionals know that effective and efficient management of unstructured data is critical in order to maximize revenue potential, control costs, and minimize risk across today’s heterogeneous, hybrid-cloud environments. However, savvy IT professionals also know this can be easier said than done without the right unstructured data management solution(s) in place. And on Data Privacy Day, we are reminded that data privacy is among the many business-critical objectives facing those trying to rein in their unstructured data.

The ideal unstructured data management platform is one that enables companies to assess, organize, and act on their data, regardless of the platform or cloud environment in which it is being stored. From the second it is installed, users should be able to garner insights into their unstructured data. From there, users should be able to quickly and easily organize the data in a way that makes sense and to enable them to achieve their highest priorities, whether it is controlling costs, CO2, or risk– or ensuring end-to-end data privacy.

Don Boxley, CEO and Co-Founder at DH2i

The perpetual concern around data privacy and protection has led to an abundance of new and increasingly stringent regulations around the world. According to the United Nations Conference on Trade and Development (UNCTAD), 71 percent of countries now have data protection and privacy legislation, with another 9 percent having draft legislation. This increased scrutiny makes perfect sense. Data is being created and flowing not just from our business endeavors, but countless personal interactions we make every day – whether we are hosting an online conference, making an online purchase, or using a third party for ride-hailing, food delivery, or package transport.

Today, as organizations endeavor to protect data — their own as well as their customers’ — many still face the hurdle of trying to do so with outdated technology that was simply not designed for the way we work and live today. Most notably, many organizations are relying on virtual private networks (VPNs) for network access and security. Unfortunately, both external and internal bad actors are now exploiting VPNs’ inherent vulnerabilities. However, there is light at the end of the tunnel. Forward-looking IT organizations have discovered the answer to the VPN dilemma: an innovative and highly reliable approach to networking connectivity, the Software Defined Perimeter (SDP). This approach enables organizations to build a secure software-defined perimeter and use Zero Trust Network Access (ZTNA) tunnels to seamlessly connect all applications, servers, IoT devices, and users behind any symmetric network address translation (NAT) to any full cone NAT, without having to reconfigure networks or set up complicated and problematic VPNs. With SDP, organizations can ensure safe, fast, and easy network and data access while ensuring they adhere to internal governance and external regulatory compliance mandates.

Steve Santamaria, CEO at Folio Photonics

It is no secret that data is at the center of everything you do. Whether you are a business, a nonprofit, an educational institution, a government agency, or the military, it is vital to your everyday operations. It is therefore critical that the appropriate person(s) in your organization have access to the data they need anytime, anywhere, and under any conditions. However, it is equally important to keep that data from falling into the wrong hands.

Therefore, when managing current and archival data, a top concern must be data security and durability, not just today but for decades upon decades into the future. The ideal data storage solution must offer encryption and WORM (write-once, read-many) capabilities. It must require little power and minimal climate control. It should be impervious to EMPs, salt water, high temps, and altitudes. And, all archive solutions must have 100+ years of media life and be infinitely backward compatible, while still delivering a competitive TCO. But most importantly, the data storage must have the ability to be air-gapped as this is truly the only way to prevent unauthorized digital access.

Surya Varanasi, CTO at Nexsan

Digital technology has revolutionized virtually every aspect of our lives. Work, education, shopping, entertainment, and travel are just a handful of the areas that have been transformed. Consequently, today, our data is like gravity – it’s everywhere. On Data Privacy Day, we are reminded of this fact, and the need to ensure our data’s safety and security. Fortunately, there are laws and regulations that help to take some of the burden off of our shoulders; such as the General Data Protection Regulation (GDPR), California Consumer Privacy Act (CCPA), and Health Insurance Portability and Accountability Act (HIPAA).

However, some of the responsibility remains on our shoulders, as well as on those of the data management professionals we rely upon. Today, it would be extremely challenging to find an organization (or an individual, for that matter) that isn’t backing up its data. Unfortunately, however, today that just isn’t enough. Cyber criminals have become increasingly aggressive and sophisticated, along with their ransomware and other malware. And now the threat isn’t just that they will hold your data hostage until payment; cyber criminals are also threatening to make personal and confidential data public if they are not paid. It is therefore critical that cyber hygiene include protecting backed-up data by making it immutable and by eliminating any way that data can be deleted or corrupted.

This can be accomplished with an advanced Unbreakable Backup solution, which creates an immutable, object-locked format, and then takes it a step further by storing the admin keys in another location entirely for added protection. With an Unbreakable Backup solution that encompasses these capabilities, users can ease their worry about the protection and privacy of their data, and instead focus their expertise on activities that more directly impact the organization’s bottom-line objectives.
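The immutable, object-locked idea described above can be illustrated with any object store that supports write-once retention. The sketch below uses S3 Object Lock purely as a generic example, not as Nexsan’s Unbreakable Backup implementation; the bucket and key names are placeholders, valid AWS credentials are assumed, and the bucket is assumed to have Object Lock enabled at creation.

```python
from datetime import datetime, timedelta, timezone

import boto3  # AWS SDK, used here only to illustrate object-lock immutability

s3 = boto3.client("s3")

# Placeholder bucket; Object Lock must have been enabled when the bucket was
# created for these parameters to take effect.
BUCKET = "example-backup-vault"

def write_immutable_backup(key: str, data: bytes, retention_days: int = 30) -> None:
    """Store a backup object that cannot be deleted or overwritten until the
    retention date passes (S3 Object Lock in COMPLIANCE mode)."""
    retain_until = datetime.now(timezone.utc) + timedelta(days=retention_days)
    s3.put_object(
        Bucket=BUCKET,
        Key=key,
        Body=data,
        ObjectLockMode="COMPLIANCE",            # cannot be shortened, even by admins
        ObjectLockRetainUntilDate=retain_until,
    )

write_immutable_backup("backups/2023-01-28/db.dump", b"...backup bytes...")
```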

Andrew Russell, Chief Revenue Officer at Nyriad

Data Privacy Day serves as a great reminder of the value and power of data. In addition to your people, data is without question the most strategic asset of virtually any organization. Data, and the ability to fully leverage, manage, store, share, and protect it, enables organizations to be successful across virtually every facet of the business, from competitive advantage and innovation to the employee experience, customer satisfaction, and legal and regulatory compliance competency.

Consequently, savvy data management professionals recognize that while a storage solution able to deliver unprecedented performance, resiliency, and efficiency with a low total cost of ownership is priority number one for fully optimizing data and intelligence for business success, they likewise need the ability to protect against, detect, and recover from a successful cyber-attack in order to protect their data and ensure business survival.

Brian Dunagan, Vice President of Engineering at Retrospect

Every organization, regardless of size, faces the real possibility that it could be the next victim of a cyberattack. That is because today’s ransomware, which is easier than ever for even the novice cybercriminal to obtain via ransomware-as-a-service (RaaS), strikes repeatedly and randomly without even knowing whose system it is attacking. Ransomware now simply searches for that one crack, that one vulnerability, that will allow it entry to your network. Once inside, it can lock down, delete, and/or abscond with your data and demand payment should you wish to keep your data private and/or have it returned.

As an IT professional, it is therefore critical that, beyond protection, steps be taken to detect ransomware as early as possible in order to stop the threat and ensure the ability to remediate and recover. A backup solution that includes anomaly detection, identifying changes in an environment that warrant the attention of IT, is a must. To ensure its benefit, users must be able to tailor the backup solution’s anomaly detection to their business’s specific systems and workflows, with capabilities such as customizable filtering and thresholds for each of their backup policies. And those anomalies must be immediately reported to management, as well as aggregated for future ML and analytics purposes.
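As a rough illustration of per-policy anomaly detection with customizable thresholds, the sketch below flags a backup run whose changed-file count deviates sharply from recent history. The policy names, threshold values, and simple z-score heuristic are assumptions made for the example, not Retrospect’s actual detection logic.

```python
from statistics import mean, stdev

# Hypothetical per-policy thresholds: how far (in standard deviations) a
# backup run may deviate from its recent history before IT is alerted.
POLICY_THRESHOLDS = {
    "file-server-nightly": 3.0,
    "endpoint-hourly": 4.0,   # endpoints are noisier, so a looser threshold
}

def is_anomalous(policy: str, history: list[int], current: int) -> bool:
    """Flag a backup run whose changed-file count deviates sharply from the
    policy's recent history (a crude stand-in for real anomaly detection)."""
    if len(history) < 5:
        return False  # not enough history to judge
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return current != mu
    z = abs(current - mu) / sigma
    return z > POLICY_THRESHOLDS.get(policy, 3.0)

# 40,000 changed files on a volume that normally changes ~1,200 is worth a look.
history = [1150, 1230, 1190, 1275, 1210, 1180]
print(is_anomalous("file-server-nightly", history, 40_000))  # True
```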

Peter Kreslins, CTO at Digibee

When reviewing data privacy and cybersecurity, it’s important to consider integrations and create an enterprise integration strategy including data privacy and security procedures and technologies, which are critical requirements for governance, compliance and customer trust. An integration platform solution such as an enterprise integration platform as a service (eiPaaS) includes a dashboard enabling transparent and comprehensive monitoring and reporting of the data privacy and security of every integration, rather than laborious and time-consuming review of log files. You also can encrypt all data passing through the integrations and the platform so that data cannot be breached to expose personally identifiable information.
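As a minimal sketch of encrypting data as it passes through an integration layer, the example below wraps a payload in symmetric encryption. In a real eiPaaS deployment the key would come from a secrets manager and the cipher would be applied by the platform itself; this is an illustration of the concept, not Digibee’s implementation.

```python
from cryptography.fernet import Fernet  # pip install cryptography

# In practice the key would live in a secrets manager, not in code;
# it is generated inline here only to keep the sketch self-contained.
key = Fernet.generate_key()
cipher = Fernet(key)

def protect_payload(record: bytes) -> bytes:
    """Encrypt a record before it moves between integrated applications,
    so PII is never exposed in transit through the integration layer."""
    return cipher.encrypt(record)

def reveal_payload(token: bytes) -> bytes:
    """Decrypt a record at the authorized destination."""
    return cipher.decrypt(token)

token = protect_payload(b'{"customer": "Jane Doe", "email": "jane@example.com"}')
assert reveal_payload(token).startswith(b'{"customer"')
```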

George Waller, Co-Founder and EVP at Zerify

The most valuable commodity today is data. With data, you have identities, corporate information, and proprietary health care details, and 2023 will only lead to an explosion of more data as more companies rely on video conferencing. Video conferencing now plays a critical role in how businesses interact with their employees, customers, clients, vendors, attorneys, and many others. Organizations use video conferencing to discuss M&A, legal, military, healthcare, intellectual property, and other topics, and even corporate strategies. Almost all of that data falls under one of the compliance regulators because it’s considered sensitive, confidential, or even classified. A loss of data like that could be catastrophic for a company, its employees, its clients, and its customers. According to the latest IBM breach report, the average cost of a data breach in the U.S. is now $9.44 million, and 60 percent of small businesses go out of business within six months of a data breach.

Tilo Weigandt, COO and co-founder of Vaultree

A zero-trust framework powered by AI and machine learning is not the only solution to best protect your data privacy. Other approaches include using encryption, implementing strict access controls, and regular monitoring and auditing systems. It is important to note that data privacy is a complex issue and there is no one-size-fits-all solution. Organizations should consult experts to determine the best approach for their specific needs and requirements, especially with data privacy rules certain to get more strict. State-level momentum for privacy bills is at an all-time high to regulate how consumer data is shared. Recent developments such as the California Privacy Rights Act, the quantum computing security legislation, and Virginia Consumer Data Protection Act clearly show that protecting consumer privacy is a growing priority in the U.S.

Dan LeBlanc, CEO at Daasity

Consumer brands collect mountains of consumer data. Not only are data leaks and data theft major issues, but hackers gaining access to operational or business intelligence tools is a risk as well.

When a consumer brand takes orders directly from customers, data is collected at multiple stages—during ordering, fulfillment, and servicing of accounts. This data can live across several platforms, and it’s typically ingested and stored in a central database for analytics. Each of these platforms poses a risk that someone can steal login information or a password and export customer data. This isn’t some sophisticated hack but a simple, “I got your password and was able to log in.”

To prevent this, companies must ensure that all of their systems have two-factor or multi-factor authentication (2FA/MFA) turned on, that access controls are in place to restrict unauthorized individuals from certain internal tools and data, and that sensitive data is masked via anonymization or tokenization. With several layers of defense in place, it becomes much harder for a hacker to export all of your customer information into an Excel spreadsheet.
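A minimal sketch of the masking techniques mentioned above, tokenization and partial masking, is shown below. The secret value, token format, and helper names are illustrative assumptions rather than any specific platform’s approach.

```python
import hashlib
import hmac

# The secret would normally come from a vault; a fixed value is used here
# only so the example is self-contained and repeatable.
TOKENIZATION_SECRET = b"replace-with-a-vaulted-secret"

def tokenize(value: str) -> str:
    """Replace a sensitive value with a stable, irreversible token so
    analytics can join on it without ever seeing the raw data."""
    digest = hmac.new(TOKENIZATION_SECRET, value.encode("utf-8"), hashlib.sha256)
    return "tok_" + digest.hexdigest()[:16]

def mask_email(email: str) -> str:
    """Partial masking for display purposes, e.g. j***@example.com."""
    local, _, domain = email.partition("@")
    return (local[:1] + "***@" + domain) if domain else "***"

print(tokenize("jane@example.com"))    # same input always yields the same token
print(mask_email("jane@example.com"))  # j***@example.com
```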

Yossi Appleboum, CEO at Sepio

Corporations will need to be aware of the risk level that assets pose and handle them by ascribing an “asset risk factor score” to each device on the network. Hardware assets (for example, wireless combo keyboards and mice, which are known to be vulnerable) can easily be used to sniff out and capture sensitive data. There are multiple data leakage options using hardware assets that bypass existing security solutions (for example, capturing a user’s screen by running an HID scripting tool and exfiltrating the information through public comments on video platforms).

Wendy Mei, Head of Product and Strategy at Playsee

Developing an engineering team to enhance social-media AI technology and scan for spammer activity or illegal content is an important data-privacy undertaking. But you can’t leave it all to the algorithms. Building and improving a real, human team to constantly monitor and review content will assist users in reporting what makes them uncomfortable– to an actual person who has feelings, too.

Having to answer, ‘What is your birthday?’ may seem intrusive but is vital for platforms to provide a safer environment and experience for younger users. And that’s just the first step. Going further, deep and deliberate development of AI and a dedicated content review system will ensure that all posts follow strict guidelines. Content moderation, especially on social media, will continue to innovate its tech stack to better protect children.

Cindi Howson, Chief Data Strategy Officer at ThoughtSpot

In a digital economy, we are creating, capturing, and sharing more personal data than ever before. Companies rely on customer data more than ever to create actionable insights to personalize services, operate more efficiently and drive business growth. We’re living in the “decade of data”– and with this comes, of course, the decade of data privacy.

Privacy now extends far beyond protecting ourselves physically and encompasses everything we do or interact with digitally: our online footprint, often referred to as our digital twin. We’ve seen a raft of high-profile data breaches in the spotlight this past year, which has fueled public concern around data privacy. As companies become more data-dependent, customers become even more reluctant to share data, while citizens remain woefully ignorant about the data collected on them. It is this tension and misalignment that needs to be properly addressed in order to unlock data’s full potential.

Those working with customer data within any business need to be vigilant about how personal data is collected, stored, and used, as well as the implications of failing to handle this data correctly. Behind this data are real people, many of whom will not hesitate to take their business elsewhere should their data be lost or exposed. Ensuring data privacy is not just a technology issue, it’s also about company culture, process, and controls. And with analysts now able to extract increasing amounts of data from even more internal and external sources, ensuring data privacy must be part of an organization’s DNA. Dumping data from analytics tools to spreadsheets remains a weak link. Nowadays, laws and regulations such as GDPR and CCPA place stricter requirements on organizations, while giving individuals more access and rights around their data. Data Privacy Day, and the extended Data Privacy Week, is our opportunity, as businesses and data leaders, to bring awareness to those persistent knowledge gaps, take a closer look at best practices around data, and open up the conversation around data privacy and protection.

Rebecca Krauthamer, Co-Founder and CPO at QuSecure

Ahead of Data Privacy Day January 28, it is advisable that federal agencies, commercial organizations and other infrastructure providers begin to immediately assess potential vulnerabilities in their current encryption and cybersecurity practices and start planning for post-quantum encryption. Some believe that building a quantum computer powerful enough to break encryption is a decade or more away. Others believe it’s already too late. While quantum computers powerful enough to crack RSA are not yet available, hackers are seizing and storing sensitive data knowing they will be able to use quantum technology to access it soon.

We know that well-funded hacking organizations and governments are constantly working on novel ways to accelerate quantum development, including advanced error correction, combinations of individual quantum processors, and advanced physical architectures, in a race to become the first to wield the power of quantum decryption. We are most likely closer to more quantum power, and the associated threats to standard encryption, than expected. Every day we don’t convert our security posture to a quantum-safe one adds to damage from which there will be no recovering.

Kelly Ahuja, CEO of Versa Networks

As Data Privacy Day approaches January 28, we need to be reminded that users are connecting from everywhere to systems and applications via private and public clouds, dissolving the enterprise perimeter. They are also connecting devices such as phones, tablets and laptops, to both their work and home networks, ultimately, blurring the divide which was once there. The expanding attack surface is providing the perfect entry point for threat actors. Once they have penetrated the perimeter, threat actors can move laterally across the network, accessing sensitive data and exfiltrating it before security teams have even had time to react. With hybrid work becoming the new normal and the increasing demand for Internet of Things (IoT) devices, the traditional approaches to cybersecurity and data protection are no longer sufficient.

Essentially, the next major data breach could start from someone’s home tablet or laptop. There is a clear problem when it comes to security in the remote working world; however, it is pointless to secure networks with a solution that hampers both connectivity and performance. Companies need to take a closer look at deploying Secure Access Service Edge (SASE) technologies that make the entire network visible, including all connecting remote workers and IoT devices. SASE also ensures networks are segmented, restricting the movement of malware and allowing security teams to quickly locate, detect, and mitigate cyberattacks.

Eve Maler, CTO at ForgeRock

This Data Privacy Week, it’s critical to pay close attention to the increased use of artificial intelligence (AI) in the age of social media. Consumer-accessible AI is increasingly making its way into popular social media sharing applications. For instance, enhancing self portraits with AI to then share with followers on social media is the latest trend in photo editing applications. However, in doing so, consumers are handing over biometrically-significant data — a dozen or more photos of your face — to an unknown third party. The service’s privacy policy, security protections, AI model particularities, and trustworthiness gain new importance in light of the need to share so much data.

Biometrics have special requirements when it comes to keeping personal data safe and secure. Service providers need to make ethical management of biometric data a guiding principle. Pay special attention to meaningful user consent and to oversight of data management. Performing facial recognition also exposes the service to a wealth of derivable personal data, such as age, gender, ethnicity, and health. Decentralized device-based storage of biometric data is always safest.

Molly Presley, SVP of Marketing at Hammerspace

With global rules governing how data should be stored, used, and shared, combined with escalating data losses, explosive personal data growth, and customer expectations, addressing data privacy is now an obligatory business requirement. However, as organizations expand and navigate compliance and legal requirements in the rapidly evolving age of big data, AI/ML, and government regulations, the existing processes surrounding data privacy need to evolve to 1) automate processes and 2) scale to meet increasingly complex new challenges.

Privacy and security concerns increasingly impact multiple vertical markets, including finance, government, healthcare and life sciences, telecommunications, IT, online retail, and others, as they quickly outgrow legacy data storage architectures. As a result, there is increasing pressure to develop and implement a data strategy and architecture for decentralized data that is more cohesive, making access to critical information simplified and secure.

To protect the organizations’ and individual users’ sensitive data, organizations must take the steps necessary to control how data is shared and eliminate the proliferation of data copies outside the controls of IT security systems. Accelerating IT modernization efforts while managing the ever-increasing volumes of data requires a data solution that simplifies, automates, and secures access to global data. Most importantly, to ensure data privacy and secure data collaboration, a data solution must be able to put data to use across multiple locations and to multiple users while simplifying IT Operations by automating data protection and data management to meet policies set by administrators.

Nick Hogg, Director of Technical Training at Fortra

With the rise of remote working, sharing sensitive files is now taken for granted. Therefore, awareness days and weeks, like Data Privacy Week, are a great way to remind organizations and their stakeholders of the importance of storing and handling data properly.

It’s essential for organizations to re-evaluate their security awareness and compliance training programs to move away from the traditional once-a-year, ‘box-ticking’ exercises that have proven to be less effective. The goal is to deliver ongoing training that keeps data security and compliance concerns front and center in employees’ minds, allowing them to better identify phishing and ransomware risks, as well as reducing user error when handling sensitive data.

They will also need to use digital transformation and ongoing cloud migration initiatives to re-evaluate their existing data loss prevention and compliance policies. The goal is to ensure stronger protection of their sensitive data and meet compliance requirements, while replacing complex infrastructure and policies to reduce the management overhead and interruptions to legitimate business processes.

Wade Barisoff, Director of Product, Data Protection at Fortra

As new states contemplate their own flavors of data privacy legislation, the only consistency will be the fact that each new law is different. We are already seeing this now: in California, residents can sue companies for data violations, whereas in other states it is the attorney general’s office that can impose fines. In Utah, the standards apply to fewer businesses than in other states. As each state seeks to show how much more it values its citizens’ rights than the next, an element of ‘What’s good for California isn’t good enough for Kansas’ will creep in, and this developing complexity will have a significant impact on organizations operating across the country.

Before GDPR there were (and still are) many different country-level laws for data privacy. GDPR was significant not because it was a unifying act that enshrined the rights of people and their digital identities to govern how their data could be handled, but because it was the first legislation with real teeth. Fines for non-compliance were enough to force companies into action.

So far, five states have (or will have) individual laws, but there are 45 more yet to come. The amount of money and time companies will spend enacting the proper controls for these individual privacy laws fuels the argument for a more unified national approach to data privacy standards, as the penalties for non-compliance are significant. Also, as states begin to increase the demands on business, usually without fully understanding the technology landscape and how businesses work with shared and cloud-based technologies, there’s a potential that companies will be forced to make the decision not to conduct business in certain areas. A national approach would allow businesses to tackle data privacy once, but as it stands, with the federated states model, doing business within the U.S. is likely to get more complicated and expensive.

Jeff Sizemore, Chief Governance Officer at Egnyte

Data Privacy Day reminds us that personal privacy is being viewed more and more as a global human right– by 2024, it’s predicted that 75 percent of the world’s population will be protected under modern data privacy regulations. We will continue to see data privacy gain significant traction across industries and business disciplines, such as with personal financial data rights. Company trust will increasingly have a larger impact on customers’ buying decisions as well.

In the U.S., five states (California, Virginia, Colorado, Connecticut and Utah) have already enacted or plan to enact data privacy legislation this year. And the movement toward a federal law is only a matter of time, as we have seen positive momentum with the American Data Privacy and Protection Act (ADPPA). Without a doubt, as government entities and regulatory bodies show increased interest in data privacy, we can anticipate stronger enforcement mechanisms. Enforcement of regulations will become more strict, with fines and litigation for noncompliance expected to increase.

There’s no time like the present to prepare for these business-impacting regulations, especially with more on the horizon. Organizations can take proactive steps like keeping data privacy policies up-to-date and gaining visibility into structured and unstructured data. Ultimately, companies that respect data privacy and understand the short- and long-term benefits of compliance will be well-positioned for the future.

Christopher Rogers, Technology Evangelist at Zerto

In 2023, data is the most valuable asset any company owns. Whether it’s the organization’s own data or its customers’, the potential loss of revenue should this data be compromised is huge. Therefore, the primary concern for all businesses should be protecting this asset.

Unfortunately, in the golden age of cybercrime, data protection is not such an easy task. In 2022, an IDC report, ‘The State of Ransomware and Disaster Preparedness,’ found that 83 percent of organizations had experienced data corruption from an attack, and nearly 60 percent experienced unrecoverable data as a result. While it’s clear there is a dire need for more effective data protection, it is also crucial that businesses have disaster recovery solutions in place should the worst occur.

When it comes to ransomware, the biggest financial killer is the downtime. Therefore, having a disaster recovery solution based on continuous data protection (CDP) in conjunction with backup is vital to equip companies with the ability to be resilient in the face of potentially catastrophic circumstances. Companies using CDP can limit downtime and restore operations in a matter of seconds or minutes, rather than days or weeks.

This Data Privacy Day, I want to encourage businesses to not only look at what they can be doing to protect themselves but also what solutions they have in place to recover should disaster strike.
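Conceptually, continuous data protection journals every write so that any point in time can be reconstructed. The toy sketch below replays a write journal up to a chosen recovery point; it is a simplification for illustration only, with made-up keys and timestamps, and is not how Zerto’s CDP is implemented.

```python
from datetime import datetime

# A toy write journal: (timestamp, key, value) entries captured continuously.
journal = [
    (datetime(2023, 1, 28, 9, 0),  "orders.db", "v1"),
    (datetime(2023, 1, 28, 9, 5),  "orders.db", "v2"),
    (datetime(2023, 1, 28, 9, 7),  "users.db",  "v1"),
    (datetime(2023, 1, 28, 9, 12), "orders.db", "corrupted-by-ransomware"),
]

def restore_to(point_in_time: datetime) -> dict:
    """Rebuild state as it existed at the chosen recovery point by replaying
    only the journal entries written before that moment."""
    state = {}
    for ts, key, value in journal:
        if ts <= point_in_time:
            state[key] = value
    return state

# Recover to just before the 9:12 corruption: seconds of data loss, not days.
print(restore_to(datetime(2023, 1, 28, 9, 11)))
# {'orders.db': 'v2', 'users.db': 'v1'}
```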

Tomer Shiran, CPO and Co-Founder of Dremio

Data privacy is a fundamental human right and is becoming increasingly important in the digital age as more personal information is collected, stored, and shared online. Organizations have a responsibility to protect the data privacy of individuals and ensure that personal information is handled in a responsible and ethical manner. Data privacy laws, like GDPR in the European Union and California’s CCPA, have been put in place to give individuals more control and to hold organizations accountable for data breaches and mishandling of personal information, but data privacy is a constantly evolving field. A data lakehouse should be designed with privacy in mind, processing organizational data on the customer’s premises and never storing it anywhere in the lakehouse’s infrastructure. This reduces data proliferation dramatically and helps organizations use their existing controls to safeguard their own data and their customers’ data.

Frank Baalbergen, Chief Information Security Officer at Mendix

No-code and low-code technologies are continuing to gain traction throughout the enterprise. Gartner predicts that by 2025, 70 percent of new enterprise applications will be created in low-code development environments, up from just 25 percent in 2020. Nevertheless, data privacy and security leaders often worry about losing visibility and control when implementing low-code solutions. Low-code application platforms such as Mendix integrate governance into all applications to provide a secure environment you can count on to remain competitive. When data governance is proactively implemented, everyone’s privacy is better protected, so low-code adoption shouldn’t be deterred by concerns about data privacy and regulatory compliance; these platforms support your business from the get-go.




The post 33 Data Privacy Week Comments from Industry Experts in 2023 appeared first on Best Backup and Disaster Recovery Tools, Software, Solutions & Vendors.

]]>
Solutions Review Releases 2022 Buyer’s Guide for Backup and Disaster Recovery Solutions https://solutionsreview.com/backup-disaster-recovery/solutions-review-releases-2022-buyers-guide-for-backup-and-disaster-recovery-solutions/ Tue, 07 Dec 2021 13:00:41 +0000 https://solutionsreview.com/backup-disaster-recovery/?p=4649 Solutions Review is announcing the release of its new 2022 Buyer’s Guide for Backup and Disaster Recovery Solutions in an attempt to help organizations select the best software. Solutions Review today is releasing its newly updated 2022 Buyer’s Guide for Backup and Disaster Recovery Software to assist organizations during the research and discovery phase of […]

The post Solutions Review Releases 2022 Buyer’s Guide for Backup and Disaster Recovery Solutions appeared first on Best Backup and Disaster Recovery Tools, Software, Solutions & Vendors.

]]>
Solutions Review Releases 2022 Buyer’s Guide for Backup and Disaster Recovery Solutions

Solutions Review is announcing the release of its new 2022 Buyer’s Guide for Backup and Disaster Recovery Solutions in an attempt to help organizations select the best software.

Solutions Review today is releasing its newly updated 2022 Buyer’s Guide for Backup and Disaster Recovery Software to assist organizations during the research and discovery phase of buying business software. Solutions Review editors compile each Buyer’s Guide via a meta-analysis of available online materials, research, analyst reports, conversations with subject matter experts and vendor representatives, and the examination of product demonstrations. The Buyer’s Guide for Disaster Recovery as a Service has also been updated for the new year.

Top providers highlighted include:

Solutions Review is a collection of business software news and resource sites that aggregates, curates, and creates the leading content to connect buyers and sellers. Over the past six years, Solutions Review has launched 20 distinct and category-specific Buyer’s Guide sites for technologies ranging from Cybersecurity to Big Data and WorkTech, as well as Identity and Access Management, Endpoint Security, Data Analytics and Data Management, Enterprise Resource Planning and Business Process Management, and Enterprise Cloud and Network Monitoring.

In addition to the 2022 Buyer’s Guide for Backup and Disaster Recovery Solutions, readers are encouraged to view research in other related Solutions Review coverage areas:

Solutions Review also drastically increased its presence in the virtual events space in 2021. Headlined by the Solution Spotlight series and our popular Demo Days, Solutions Review is emerging as a major multimedia influencer in enterprise technology. This success led us to explore further the key aspects of virtual events in the post-COVID era, and we plan to do many more private webinars in the year ahead.

Nearly 10 million technology professionals will use Solutions Review to help them evaluate business software this year. Solutions Review is a completely vendor-agnostic resource and does not endorse any individual product or service.

Download Solutions Review’s 2022 Buyer’s Guide for Backup and Disaster Recovery Solutions.

The post Solutions Review Releases 2022 Buyer’s Guide for Backup and Disaster Recovery Solutions appeared first on Best Backup and Disaster Recovery Tools, Software, Solutions & Vendors.

]]>
Solutions Review Set to Host Third BUDR Insight Jam https://solutionsreview.com/backup-disaster-recovery/solutions-review-set-to-host-third-budr-insight-jam/ Tue, 12 Oct 2021 17:59:42 +0000 https://solutionsreview.com/backup-disaster-recovery/?p=4607 When Solutions Review was founded in 2012, it was with a simple goal: to report on the latest developments in enterprise technology and make it easier for people to evaluate business software. We then built a collection of vendor-agnostic buyer’s resources to cut through the clutter of content and strip away the marketing hyperbole.  Solutions […]

The post Solutions Review Set to Host Third BUDR Insight Jam appeared first on Best Backup and Disaster Recovery Tools, Software, Solutions & Vendors.

]]>
Solutions Review Set to Host Third BUDR Insight Jam

When Solutions Review was founded in 2012, it was with a simple goal: to report on the latest developments in enterprise technology and make it easier for people to evaluate business software. We then built a collection of vendor-agnostic buyer’s resources to cut through the clutter of content and strip away the marketing hyperbole.

Solutions Review is organizing the third annual BUDR Insight Jam for the month of December — a one-day community web event dedicated to raising awareness around best practices when evaluating, deploying, and using backup and disaster recovery solutions. Our editors will be sharing tips and expert insights throughout the day to help practitioners plan for the end of the year and prepare for 2022. Solutions Review will also be releasing its new 2022 Buyer’s Guide for Backup and Disaster Recovery platforms so tool-seekers can get a jump start on identifying which solutions best fit their use case.

Join the largest BUDR software buyer and practitioner community

Solutions Review Backup and Disaster Recovery is the largest BUDR software buyer and practitioner community on the web. Our Universe of Influence reach is more than 7 million business and IT decision-makers, as well as C-suite and other top management professionals. Our readers primarily use us as an enterprise technology news source and trusted resource for solving some of their most complex problems.

Our collection of vendor-agnostic buyer’s resources help backup and disaster recovery buyers and practitioners during the research and discovery phase of a buying cycle. This critical stage of information gathering is where buyers narrow down the field of solution providers to a short-list they plan to engage. The mission of Solutions Review is to make it easier for buyers of BUDR software to connect with the best providers.

Why participate in the Solutions Review BUDR Insight Jam?

Wondering what’s in it for you? Join us for the BUDR Insight Jam to get insight on backup and disaster recovery software buying, learn best practices for piloting new and emerging technologies, and find out what the future will bring. It’s also going to be a top-notch networking event featuring many of the foremost thought leaders in the field.

This forum is a unifying event for all those in the industry, and we welcome industry analysts, experts, influencers, authors, practitioners, solution providers, and end-users to weigh in. Set for Tuesday, December 7, 2021, Solutions Review will be sharing, posting, and tweeting actionable backup and disaster recovery best practices content using the hashtag #BUDRInsightJam. If you are interested in participating, here are a few ways you can be a part of the event:

  • Provide us with an audio or video clip with advice to those considering a BUDR solution purchase
  • Send us short-form product demos
  • Participate in one of our discussion panels (as a speaker or a viewer)
  • Use the hashtag #BUDRInsightJam and share helpful content to help build a community space dedicated to this day
  • Share your predictions for 2022: what will next year bring in the space?
  • Share customer success stories using backup and disaster recovery platforms (be specific!)
  • Offer general advice for those evaluating backup and disaster recovery tools

For more information on the 2021 Solutions Review BUDR Insight Jam, check out our video explaining it in detail:

How do I submit content for the BUDR Insight Jam?

The deadline for submissions is November 24th. Stay tuned for details about our other Identity and Access Management, ERP, and Business Intelligence Insight Jams, set to begin on Monday, December 6th, 2021.

About Solutions Review

Solutions Review is a collection of technology news sites that aggregates, curates, and creates the best content within leading technology categories. Solutions Review’s mission is to connect buyers of enterprise technology with the best solution sellers. Over the past seven years, Solutions Review has launched 16 technology buyer’s guide sites in categories ranging from cybersecurity to wireless 802.11 and mobility management, business intelligence and data analytics, data integration, and cloud platforms.

The post Solutions Review Set to Host Third BUDR Insight Jam appeared first on Best Backup and Disaster Recovery Tools, Software, Solutions & Vendors.

]]>
Solutions Review Announces 2022 Virtual Event Calendar https://solutionsreview.com/solutions-review-announces-2022-virtual-event-calendar/ Wed, 06 Oct 2021 13:00:57 +0000 https://solutionsreview.com/backup-disaster-recovery/solutions-review-announces-2022-virtual-event-calendar/ Today, Solutions Review announced its virtual event calendar for the first half of 2022. Solutions Review President Doug Atkinson commented, “We realized the incredible reach we could achieve through these online events. With audience interest only increasing, we knew we had to go into 2022 with a complete virtual marketing strategy around creating connections between […]

The post Solutions Review Announces 2022 Virtual Event Calendar appeared first on Best Backup and Disaster Recovery Tools, Software, Solutions & Vendors.

]]>
Solutions Review Announces 2022 Virtual Event Calendar

Today, Solutions Review announced its virtual event calendar for the first half of 2022. Solutions Review President Doug Atkinson commented, “We realized the incredible reach we could achieve through these online events. With audience interest only increasing, we knew we had to go into 2022 with a complete virtual marketing strategy around creating connections between buyers and sellers of enterprise tech.”

Solutions Review has eight events scheduled for the first two quarters of 2022 – four Demo Days and four Solution Spotlights. Vendor spots are filling up quickly, so the company is asking interested solution providers to get in touch as soon as possible to be featured in an upcoming Solutions Review event.

The company created a virtual version of the tech conferences they once frequented in what they coined their “Demo Days.” These online events give up to four vendors of a particular solution category the opportunity to demo their products in front of a registered audience of attendees.

Since its first virtual event in June 2020, Solutions Review has expanded its multimedia capabilities in response to the overwhelming demand for these kinds of events. Solutions Review’s current menu of online offerings includes the Demo Day, Solution Spotlight, best practices or case study webinars, and panel discussions.

Solutions Review handles all of the promotion leading up to the events, and with the universe of influence they have created on social media, their reach is widespread. Their team of accomplished editors has published over 10,000 pieces of content indexed with search engines to drive in-market site traffic. They are an active part of nearly 300 LinkedIn IT groups, which reach over 20 million tech professionals worldwide. If you would like to lock in your spot for an upcoming Solutions Review event on the virtual event calendar, visit their events page today.

Register to be a part of Solutions Review’s 2022 Virtual Event Calendar

About Solutions Review

Solutions Review is a collection of technology news sites that aggregates, curates, and creates the best content within leading technology categories. Solutions Review’s mission is to connect buyers of enterprise technology with the best solution sellers. Over the past four years, Solutions Review has launched ten technology buyer’s guide sites in categories ranging from cybersecurity to wireless 802.11, as well as mobility management, business intelligence and data analytics, data integration, and cloud platforms.

What’s Changed: 2021 Gartner Magic Quadrant for IT Risk Management https://solutionsreview.com/backup-disaster-recovery/whats-changed-2021-gartner-magic-quadrant-for-it-risk-management/ Thu, 30 Sep 2021 19:41:41 +0000 https://solutionsreview.com/backup-disaster-recovery/?p=4593 The editors at Solutions Review highlight what’s changed since the last iteration of Gartner’s Magic Quadrant for IT Risk Management and provide an analysis of the new report. Analyst house Gartner, Inc. has released its 2021 Magic Quadrant for IT Risk Management. The researcher defines IT risk management (ITRM) products as “software and services that […]

What’s Changed: 2021 Gartner Magic Quadrant for IT Risk Management

The editors at Solutions Review highlight what’s changed since the last iteration of Gartner’s Magic Quadrant for IT Risk Management and provide an analysis of the new report.

Analyst house Gartner, Inc. has released its 2021 Magic Quadrant for IT Risk Management. The researcher defines IT risk management (ITRM) products as “software and services that operationalize the risk management life cycle of cyber and IT risks in the context of an organization’s mission.” These tools are implemented to establish a centralized hub that simplifies and facilitates business-related risk management. ITRM platforms help security and risk management (SRM) professionals manage cyber and IT risks across four common use cases: IT risk and control assessment; regulatory, industry, and policy compliance; cyber risk management; and integration with enterprise risk management.

Though ITRM tools are primarily used for the aforementioned use cases, U.S. Federal organizations often use ITRM products to meet the current and future U.S. Federal compliance regulations for the assessment and authorization of systems. Additionally, the key capabilities of ITRM solutions include workflow management; data integrations and connectors; information and asset discovery and inventory; user access; risk analysis; risk treatment life cycle; board/senior executive reporting; near-real-time IT risk profiling; regulatory and policy content management; threat and vulnerability management integrations; and incident management integrations.

The market for ITRM products is expanding, with a high level of interest in stand-alone ITRM products or ITRM use cases within integrated risk management (IRM) platforms or governance, risk, and compliance (GRC) platforms, according to Gartner. The continually increasing focus on cybersecurity has led to a growing interest in ITRM features specific to cyber risk. Additionally, interest in ITRM initiatives is projected to continue because of cybersecurity and privacy mandates, as well as a digitally enabled, remote, or hybrid business operating environment.

Gartner predicts that by 2023, 80 percent of organizations with formal risk management programs will use an ITRM product to manage their cyber and IT risks, up from 45 percent today. Additionally, the recent introduction of new vendors has disrupted the market, causing a shift towards cloud-first deployments of ITRM. Because of this, many ITRM providers have slowly moved to a SaaS-first offering. In the future, Gartner expects that ITRM vendors will embed machine learning capabilities into their products on a larger scale, including natural language processing, embedded chatbots, and evidence suggestions based on previously given evidence.

In this Magic Quadrant, Gartner evaluated the strengths and weaknesses of 14 providers that it considers most significant in the marketplace and provides readers with a graph (the Magic Quadrant) plotting the vendors based on their Ability to Execute and their Completeness of Vision. The graph is divided into four quadrants: niche players, challengers, visionaries, and leaders. At Solutions Review, we read the report, available here, and pulled out the key takeaways.

Gartner adjusts its evaluation and inclusion criteria for Magic Quadrants as software markets evolve. While no vendors were added or dropped, three vendors changed names from past iterations of this report: RSA Archer was rebranded as Archer, SAI Global was rebranded as SAI360, and Galvanize now appears as Diligent following its acquisition. Gartner also occasionally lists honorable mentions that did not meet the inclusion criteria but are of interest to its clients for reasons such as an open-source approach or market momentum. This year’s honorable mentions are Camms, CyberSaint, and eramba.

Representative vendors in this year’s Magic Quadrant include Allgress, Archer, Diligent, IBM, LogicManager, MetricStream, NAVEX Global, OneTrust, Reciprocity, Riskonnect, SAI360, ServiceNow, SureCloud, and TechDemocracy.

The leader quadrant is the most densely populated this year, containing ServiceNow, Diligent, Archer, MetricStream, IBM, NAVEX Global, and SAI360. ServiceNow is placed highest with regard to the ability to execute. This status could be attributed to the provider having one of the highest R&D budgets among the vendors assessed in this report. ServiceNow’s closest competitor in this quadrant is Diligent, which is one of only two providers included in this Magic Quadrant with an authority to operate (ATO) for its platform. This fulfills a primary qualifying criterion in cloud services procurement decisions for state and federal agencies.

Archer, MetricStream, and IBM are all grouped closely in the leader quadrant. Archer differentiates itself through its workflow process designer capability, which offers ease of use in zero- to low-code workflow design, a modern user interface, and flexible actions or workflow nodes. MetricStream’s strength is its ability to adapt and consistently improve its roadmap in response to customer feedback and demand, as shown by its investment in improving user experience. IBM, meanwhile, touts the widest geographical presence in this report and also has a strong product vision for machine learning and artificial intelligence-driven risk and compliance management augmentation.

Rounding out the leaders are SAI360 and NAVEX Global. SAI360 is located closest to the Y-axis. This placement could be due to the provider’s predefined solution tailored to smaller organizations’ needs in both IT risk and cybersecurity program management. NAVEX Global was placed closest to the X-axis. The vendor will focus on enhancing UX by making UI improvements, evolving automated workflow, and adding in-line record editing capabilities.

This year’s challengers are all located close to the Y-axis of the graph, with OneTrust being placed directly on the axis itself. The location of OneTrust could be attributed to its robust in-house knowledge capital, product design, and experience. LogicManager earned the highest ability to execute among the challengers. The vendor provides each customer with a team of advisory analysts, matched to its industry, who work with the end-user to implement the solution in alignment with business needs.

The remaining challengers in this year’s report are Reciprocity and SureCloud. In 2021 and 2022, it’s expected that Reciprocity will continue expanding its benchmarking capabilities and its platform in order to support third-party risk. SureCloud, which is offered exclusively via SaaS, is looking to rearchitect its platform to optimize performance and flexibility.

There are no visionaries listed this year, leaving only the niche players. Allgress is located closest to both the X- and Y-axes in this quadrant. Its solution is targeted mainly at SMBs in finance, healthcare, technology, and state or federal government, and Allgress also offers a range of deployment options. TechDemocracy, also a niche player, likely earned its status because it is one of the few products that focuses solely on cyber risk management as a stand-alone product. Finally, Riskonnect offers RK GoLive!, which introduces two implementation options to facilitate deployment by focusing on either best-practice configuration or customer configuration.

Read Gartner’s Magic Quadrant for IT Risk Management.


2021 Gartner Critical Capabilities for Enterprise Backup and Recovery Software Solutions: Key Takeaways https://solutionsreview.com/backup-disaster-recovery/2021-gartner-critical-capabilities-for-enterprise-backup-and-recovery-software-solutions-key-takeaways/ Mon, 16 Aug 2021 12:45:38 +0000 https://solutionsreview.com/backup-disaster-recovery/?p=4505 The editors at Solutions Review highlight the key takeaways of Gartner’s 2021 Critical Capabilities for Enterprise Backup and Recovery Software Solutions. Analyst house Gartner, Inc. has released its 2021 Critical Capabilities for Enterprise Backup and Recovery Software Solutions, companion research to the popular Magic Quadrant report. Used in conjunction with the Magic Quadrant, the Critical […]

2021 Gartner Critical Capabilities for Enterprise Backup and Recovery Software Solutions: Key Takeaways

The editors at Solutions Review highlight the key takeaways of Gartner’s 2021 Critical Capabilities for Enterprise Backup and Recovery Software Solutions.

Analyst house Gartner, Inc. has released its 2021 Critical Capabilities for Enterprise Backup and Recovery Software Solutions, companion research to the popular Magic Quadrant report. Used in conjunction with the Magic Quadrant, the Critical Capabilities report is an additional resource that can assist buyers of backup and disaster recovery solutions in finding the products that best fit their organizations.

Gartner defines Critical Capabilities as “attributes that differentiate products/services in a class in terms of their quality and performance.” Gartner rates each vendor’s product or service on a five-point scale (five being best) in terms of how well it delivers each capability. The Critical Capabilities report shows which products are best for each use case and includes a comparison graph for each, along with in-depth descriptions of the various points of comparison.
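For readers unfamiliar with how per-capability ratings roll up into a use-case ranking, the sketch below illustrates the basic weighted-average arithmetic behind this kind of scoring. It is a minimal Python illustration; the capability names, ratings, and weights are hypothetical placeholders, not Gartner’s actual data, which is published only in the report itself.

    # Illustrative only: hypothetical ratings and weights, not Gartner's figures.
    # Each capability is rated on a 1-5 scale; a use case assigns weights to the
    # capabilities, and a product's use-case score is the weighted average.

    def use_case_score(ratings: dict, weights: dict) -> float:
        """Weighted-average score for one use case (ratings on a 1-5 scale)."""
        total_weight = sum(weights.values())
        return sum(ratings[cap] * w for cap, w in weights.items()) / total_weight

    # Hypothetical ratings for a single product.
    ratings = {"Scalability": 4.2, "Security": 3.8, "Ransomware Protection": 4.5}

    # Hypothetical weighting for a cloud environments use case.
    cloud_weights = {"Scalability": 0.4, "Security": 0.3, "Ransomware Protection": 0.3}

    print(round(use_case_score(ratings, cloud_weights), 2))  # 4.17

In practice, the weights differ per use case, which is why a product can rank highly for data center environments but lower for cloud or edge environments.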

The study highlights 12 vendors Gartner considers most significant in this software sector and evaluates them against 13 critical capabilities in three use cases (data center environments, cloud environments, and edge environments). The critical capabilities include:

  • Scalability
  • Efficiency
  • Performance
  • Manageability
  • User Experience
  • Ecosystem Integration
  • OS, Hypervisors, and Container Support
  • Application and File Storage Support
  • SaaS Application Support
  • Security
  • Ransomware Protection
  • Reporting and Analytics
  • DR and Orchestration

The editors at Solutions Review have read the report, available here, and pulled out the key takeaways.

Speed of Recovery is Crucial

With the increasing number and growing complexity of ransomware attacks, comprehensive protection with a focus on speed of recovery from large-scale data loss is a key driving factor for clients considering a replacement of their existing backup and recovery platforms. Additionally, as data volumes continue to grow rapidly and applications become more dynamic, slow backup performance has become a primary concern for many businesses.

Pros and Cons of BaaS

Backup as a Service (BaaS) offerings are seeing growing utilization and are steadily adding data center and edge support, but they are also often tied to a single cloud provider. Because of this, BaaS platforms can have limited support for workloads hosted in other clouds. In general, however, Backup as a Service is particularly important for cloud environment use cases: BaaS platforms can act as cost-effective deployment models in the public cloud and have the ability to automatically tier older backup copies to low-cost archive storage.
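To make the tiering point concrete, here is a minimal, vendor-neutral sketch of what an age-based tiering rule might look like. The tier names and the 30- and 90-day thresholds are assumptions chosen for illustration; actual BaaS products expose their own policy controls.

    from datetime import datetime, timedelta, timezone

    # Hypothetical thresholds -- real products let administrators set their own.
    WARM_AFTER = timedelta(days=30)     # leave the hot/operational tier
    ARCHIVE_AFTER = timedelta(days=90)  # move to low-cost archive storage

    def storage_tier(backup_time: datetime) -> str:
        """Choose a storage tier for a backup copy based purely on its age."""
        age = datetime.now(timezone.utc) - backup_time
        if age >= ARCHIVE_AFTER:
            return "archive"
        if age >= WARM_AFTER:
            return "warm"
        return "hot"

    # Example: a copy taken 120 days ago lands in the archive tier.
    old_copy = datetime.now(timezone.utc) - timedelta(days=120)
    print(storage_tier(old_copy))  # archive

The same kind of rule generalizes to retention: once a copy ages past its retention window, it is deleted rather than demoted.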

Gartner Recommends Prioritizing Ransomware Protection

Whether user data is stored on-prem, in Software as a Service (SaaS) applications, or with public cloud Infrastructure as a Service (IaaS) providers, Gartner suggests prioritizing protection and recovery from ransomware attacks when selecting new backup software. When choosing a new backup platform, users should consider requiring that the software automate both disaster recovery for typical DR events and large-scale recovery from ransomware.

Read the 2021 Gartner Critical Capabilities for Enterprise Backup and Recovery Software Solutions.


The 10 Coolest Backup and Disaster Recovery CEOs of 2021 https://solutionsreview.com/backup-disaster-recovery/the-coolest-backup-and-disaster-recovery-ceos/ Mon, 02 Aug 2021 18:55:42 +0000 https://solutionsreview.com/backup-disaster-recovery/?p=4483 The editors at Solutions Review have examined the top vendors in the backup, DRaaS, and data protection spaces and compiled this list of the 10 coolest backup and disaster recovery CEOs of 2021. The chief executive officer (CEO) is at the top of the food chain within an organization. The CEO undertakes many responsibilities, such […]

The 10 Coolest Backup and Disaster Recovery CEOs of 2021

The editors at Solutions Review have examined the top vendors in the backup, DRaaS, and data protection spaces and compiled this list of the 10 coolest backup and disaster recovery CEOs of 2021.

The chief executive officer (CEO) is at the top of the food chain within an organization. The CEO undertakes many responsibilities, such as developing a strategy and direction and setting the precedent for their business’ principles, conduct, and culture. The chief executive is also responsible for building an executive leadership team and allocating funds to match the company’s goals and priorities. Some CEOs have even more on their plate, whether they are at the head of the top backup and disaster recovery companies or an emerging startup. Sometimes they can be responsible for more than just the traditional duties and can do anything from brewing coffee to marketing their product.

Solutions Review has compiled this list of the 10 coolest backup and disaster recovery CEOs based on a number of factors, including each company’s market share, growth trajectory, and the impact each individual has had on their company’s presence in one of the most competitive global software markets. Some of the top backup and disaster recovery CEOs have been with their respective companies since day one, while others are serial entrepreneurs. But no matter their background, each CEO brings a diversity of skills and a unique perspective to the table that allows their company to thrive.

The Coolest Backup and Disaster Recovery CEOs of 2021

Tom Signorello, Arcserve

Tom Signorello is the CEO of Arcserve. His responsibilities include establishing the company’s global strategy, driving value to achieve worldwide sales goals, and setting the strategic direction of its portfolio of solutions. Signorello has been in the tech industry for 23 years, and over that time he has developed a track record of creating profitable revenue streams across multiple industry sectors. Before joining Arcserve in 2017, Signorello was CEO of the global solutions and services provider OnX. Prior to his time at OnX, Signorello ran the North American business for Diebold, overseeing over half of the company’s revenue. He also served as senior vice president of global government managed services at Xerox. Recently, Signorello led Arcserve in a merger with StorageCraft.

Learn more and compare products with the Solutions Review Buyer’s Guide for Backup and Disaster Recovery.

Mohit Aron, Cohesity

Mohit Aron, the CEO and Founder of Cohesity, has more than 15 years of experience in building scalable, high-performance distributed systems and has been credited with increasing the popularity of hyperconvergence. Aron founded Cohesity in 2013 and co-founded the data storage company Nutanix in 2009. Before co-founding Nutanix, Aron worked at Google as a lead developer on the Google File System engineering project. Aron’s success is evident, as shown by the World Economic Forum (WEF) naming Cohesity one of the world’s 61 most promising Technology Pioneers in 2018. Recently, Cohesity also raised $250 million in Series E funding and is currently valued at $3.7 billion.

Learn more and compare products with the Solutions Review Buyer’s Guide for Backup and Disaster Recovery.

Sanjay Mirchandani, Commvault

Sanjay Mirchandani is the CEO of Commvault, having succeeded long-time former CEO Robert (Bob) Hammer on February 5, 2019. Mirchandani most recently served as the President and CEO of the open-source configuration management software company Puppet. During his time at Puppet, he grew the user base of the organization’s open-source and commercial solutions to more than 40,000 companies, including 75 percent of the Fortune 100. Additionally, Mirchandani expanded Puppet’s global presence by opening offices in Seattle, Singapore, Sydney, Tokyo, and Timisoara, Romania. Before that, he spent twenty years in senior leadership roles at Microsoft, VMware, and EMC Corp.

Learn more and compare products with the Solutions Review Buyer’s Guide for Backup and Disaster Recovery.

Michael Dell, Dell Technologies

Michael Dell is chairman and CEO of tech industry giant Dell Technologies. Boasting revenues of $94B and 158,000 team members, Dell Technologies is one of the largest IT companies in the world, providing services to large enterprises, small businesses, and consumers alike. Dell founded the company that would become Dell Technologies in 1984 at the age of 19, and in 1992, became the youngest CEO to ever earn a ranking on the Fortune 500. Additionally, in 1999, he and his wife established the Michael & Susan Dell Foundation to increase opportunities for children growing up in urban poverty in the US, India, and South Africa. Michael Dell also serves as a member of the Technology CEO Council and Business Roundtable, as well as an executive committee member of the International Business Council.

Learn more and compare products with the Solutions Review Buyer’s Guide for Backup and Disaster Recovery.

Jaspreet Singh, Druva

Jaspreet Singh is the founder and CEO of Druva. Singh’s experience with product vision and general management has enabled Druva to become one of the faster-growing companies in the data protection and management market. As a result of Singh’s entrepreneurial spirit, Druva has now raised approximately $328 million in venture funding and serves 4,000 customers globally. In founding Druva, Singh created the first and only cloud-native Data Management-as-a-Service (DMaaS) company, disrupting the traditional data protection market. Before starting Druva, Singh held foundational roles at both Veritas and Ensim Corp. Druva recently acquired SFApex for an undisclosed amount. In April of 2021, Druva also raised $147 million in Series H funding from a group of investors.

Learn more and compare products with the Solutions Review Buyer’s Guide for Backup and Disaster Recovery.

Simon Taylor, HYCU

Simon Taylor, the CEO of HYCU, has over 15 years of experience in go-to-market strategy development, product marketing, and channel sales management for the tech industry. Prior to his time at HYCU, Taylor worked with companies including Comtrade Group, Forrester Research, Putnam Investments, and Omgeo, holding senior executive leadership positions at Comtrade Software and Comtrade Group. Taylor has been the CEO of HYCU since 2018, as well as a member of the Forbes Technology Council since 2020. Recently, HYCU raised $87.5 million in Series A funding from Acrew Capital and Bain Capital Ventures.

Learn more and compare products with the Solutions Review Buyer’s Guide for Backup and Disaster Recovery.

Russell P. Reeder, Infrascale

Russell P. Reeder is the CEO of Infrascale. He has more than 25 years of experience as a tech, sales, product, and branding executive. Previously, Reeder led OVHcloud’s growth into the United States. OVHcloud then successfully launched in the US, acquired vCloud Air from VMware, and built two additional data centers, bringing the organization to a total of 30 data centers globally. Additionally, Reeder led the premium cloud hosting company, MediaTemple, where he managed the company’s global sales growth, brand, and strategic direction.

Learn more and compare products with the Solutions Review Buyer’s Guide for Backup and Disaster Recovery.

Bipul Sinha, Rubrik

Rubrik’s CEO and Co-Founder, Bipul Sinha, brings more than 20 years of experience building successful companies and products from the ground up. Prior to Sinha’s time at Rubrik, he was a partner at LightSpeed, where he invested in Nutanix, PernixData, and Numerify. He is currently a Venture Partner at LightSpeed. Sinha also serves as a founding investor and board member at Nutanix, as well as acting as the Board Chairman and Co-Founder of Confluera. Additionally, Sinha has held engineering positions at Oracle, American Megatrends, and IBM. In late 2020, Rubrik acquired Igneous.

Learn more and compare products with the Solutions Review Buyer’s Guide for Backup and Disaster Recovery.

William H. Largent, Veeam

William H. Largent is the CEO and Chairman of the Board at Veeam. He has logged more than 30 years in operations and leadership roles at growth companies. Before his time as CEO and Chairman of the Board, Largent was Veeam’s Executive Vice President of Operations, as well as a board member. Largent was previously the CEO of Applied Innovation Inc., a public company traded on the NASDAQ. He was also the CFO of Plug Power, where he managed the business’ initial public offering and raised more than $150 million in equity. Recently, Veeam was acquired by Insight Partners for $5 billion.

Learn more and compare products with the Solutions Review Buyer’s Guide for Backup and Disaster Recovery.

Greg Hughes, Veritas Technologies

Greg Hughes is the CEO of Veritas. He has over 20 years of experience as an executive in enterprise software. Prior to joining Veritas, Hughes was the CEO of Serena Software. In his time there, Hughes led the successful turnaround and sale to Micro Focus. Additionally, he has held executive roles in the technology investment firms Silver Lake Partners and HGGC. Hughes also served in a range of senior executive roles at Symantec, most recently as President of the $4 billion Enterprise Product Group. Hughes joined Symantec through its acquisition of Veritas, where he was executive vice president of Global Services. To extend its digital compliance portfolio, Veritas acquired Globanet in 2020 and HubStor in 2021.

Learn more and compare products with the Solutions Review Buyer’s Guide for Backup and Disaster Recovery.


What’s Changed: 2021 Gartner Magic Quadrant for Enterprise Backup and Recovery Software Solutions https://solutionsreview.com/backup-disaster-recovery/whats-changed-2021-gartner-magic-quadrant-for-enterprise-backup-and-recovery-software-solutions/ Wed, 21 Jul 2021 19:29:47 +0000 https://solutionsreview.com/backup-disaster-recovery/?p=4472 The editors at Solutions Review highlight what’s changed since the last iteration of Gartner’s Magic Quadrant for Enterprise Backup and Recovery Software Solutions and provide an analysis of the new report. Analyst house Gartner, Inc. has released its 2021 Magic Quadrant for Enterprise Backup and Recovery Software Solutions. The researcher’s view of the enterprise backup […]

What’s Changed: 2021 Gartner Magic Quadrant for Enterprise Backup and Recovery Software Solutions

The editors at Solutions Review highlight what’s changed since the last iteration of Gartner’s Magic Quadrant for Enterprise Backup and Recovery Software Solutions and provide an analysis of the new report.

Analyst house Gartner, Inc. has released its 2021 Magic Quadrant for Enterprise Backup and Recovery Software Solutions. The researcher’s view of the enterprise backup and recovery software solution market is focused on transformational technologies rather than the market as it exists today. Gartner defines the market as follows: “backup and recovery software solutions are designed to capture a point-in-time copy (backup) of an enterprise workload and write the data out to a secondary storage device for the purpose of recovering this data in case of loss.”

According to Gartner, the market went through significant change during the last 24 months. The vendors named in this report mainly focused on centralized management; ransomware resilience, detection, and remediation; support for public cloud IaaS and PaaS backup; support for SaaS-based applications; tiering to the public cloud; recovery in the public cloud; NoSQL database backup; instant recovery of databases and virtual machines; container backup; and subscription licensing.

The key capabilities of a backup and recovery solution include backing up and recovering operating systems, files, databases, and applications in the on-prem data center; creating a copy of the backup in the same physical location as the production environment for quick operational recovery; assigning multiple backup and retention policies that align with an organization’s recovery objectives; and reporting the success and failure of backup and recovery tasks.
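To ground the “multiple backup and retention policies” capability, here is a minimal sketch that models a policy as a small data structure and checks whether its schedule can satisfy a workload’s recovery point objective (RPO). The policy names, intervals, and retention periods are hypothetical examples, not defaults from any particular product.

    from dataclasses import dataclass
    from datetime import timedelta

    @dataclass
    class BackupPolicy:
        name: str
        interval: timedelta          # how often a backup job runs
        local_retention: timedelta   # how long copies stay on the operational copy
        offsite_retention: timedelta # how long copies are kept off-site

        def meets_rpo(self, rpo: timedelta) -> bool:
            # Worst case, the newest recoverable copy is one full interval old.
            return self.interval <= rpo

    # Hypothetical policies for two workload tiers.
    gold = BackupPolicy("gold", timedelta(hours=1), timedelta(days=14), timedelta(days=365))
    bronze = BackupPolicy("bronze", timedelta(hours=24), timedelta(days=7), timedelta(days=90))

    print(gold.meets_rpo(timedelta(hours=4)))    # True  -- hourly backups satisfy a 4-hour RPO
    print(bronze.meets_rpo(timedelta(hours=4)))  # False -- daily backups cannot

Assigning the right policy per workload is what keeps recovery objectives and storage costs aligned, rather than applying one schedule to everything.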

Buyers should note that this market offers a broad array of products for different requirements and preferences, typically dependent on the user persona. While the eye may be drawn first to the Leaders quadrant, Gartner states that it is best to evaluate your specific needs when assessing vendors. The firm adds, “Gartner does not recommend eliminating Niche Players from customer evaluations. Niche Players are specifically and consciously focused on a subsegment of the overall market, or they offer relatively broad capabilities without very large enterprise-scale or the overall success of competitors in other quadrants.”

Gartner adjusts its evaluation and inclusion criteria for Magic Quadrants as software markets evolve. Gartner notes that a vendor being dropped from a Magic Quadrant one year does not necessarily mean that the research firm has changed its view of the vendor. Instead, it may reflect a change in the market or a change of focus by a vendor. Actifio was dropped, as it no longer met the researcher’s inclusion criteria, while Druva, Micro Focus, and Zerto were all added to this year’s report. 

In this Magic Quadrant, Gartner evaluated the strengths and weaknesses of 13 providers that it considers most significant in the marketplace and provides readers with a graph (the Magic Quadrant) plotting the vendors based on their ability to execute and completeness of vision. The graph is divided into four quadrants: niche players, challengers, visionaries, and leaders. At Solutions Review, we read the report, available here, and pulled out the key takeaways.

As in last year’s edition of this report, the leaders’ quadrant is the most densely populated. Additionally, the vendors listed are largely the same, with the only exception being IBM, which was named a challenger this year. Veeam has won out with the highest ability to execute in 2021. Within the past year, the vendor has announced a range of new tools for Azure and GCP and significantly expanded its AWS backup capabilities. Additionally, Veeam acquired Kasten in October 2020, thereby expanding its portfolio to include container backup support. Placing second in ability to execute, and once again one of Veeam’s closest competitors, is Commvault. The vendor’s recent aggressive feature release cadence and rapid geographic expansion have ensured that Commvault Metallic is in a position to address users looking to transition to an as-a-service model in hybrid infrastructure environments.

Veritas Technologies and Rubrik are also closely clustered near Veeam and Commvault in the leader quadrant. In the past year, Veritas acquired HubStor, which specializes in the protection of SaaS applications. The provider also announced NetBackup Flex Scale, an integrated scale-out appliance reference architecture based on NetBackup 9.0. Rubrik’s placement as a leader is likely due to its Polaris SaaS platform, which provides centralized visibility across multiple Rubrik cluster deployments located either on-prem or in the cloud, protection of IaaS instances and SaaS applications, integrated security, and workflow management.

Rounding out the leader quadrant are Dell Technologies and Cohesity. Dell Technologies has recently made four major software updates to its PowerProtect Data Manager. These new enhancements include support for application-consistent backups in Kubernetes, including VMware Tanzu Kubernetes; SAP HANA backup; guest workload support for AWS, Azure, and Google Cloud Platform; and improvements to Oracle Database and VMware backup. Cohesity’s leader status could be attributed to the fact that its DataProtect service is reportedly easy to manage. Additionally, Cohesity differentiates itself by enabling users to leverage Helios to provide disaster recovery orchestration, ransomware detection, capacity management, issue root cause analysis, and case management.

Gartner naming IBM a challenger is one of the major changes in this iteration of the Magic Quadrant. In the 2020 edition of this report, there were no challengers, while this year, Arcserve and IBM take up space in this quadrant. IBM’s change in placement could be because the vendor depends on third-party vendors to address backup requirements for Microsoft SharePoint, Microsoft 365 Exchange Online, some NoSQL databases, Nutanix AHV VMs, OpenStack environments, hardware snapshot management, and bare metal recovery of operating systems. In last year’s report, Arcserve was a niche player, while this year it scored higher on its ability to execute, landing in the challenger section. This shift could be caused by the vendor’s robust anti-malware capabilities resulting from Arcserve’s partnership with Sophos.

The three niche players in this year’s Magic Quadrant are Unitrends, Micro Focus, and Zerto. Both Unitrends and Micro Focus are placed near the middle of this section, while Zerto is located close to the center of the complete graph. Unitrends is the only vendor in this section that was also a niche player last year, as Micro Focus and Zerto are both new vendors this year. In the past 12 months, Unitrends has announced the Recovery Series Gen 9 appliances with improved capacity and performance, as well as UniView, a platform that delivers centralized management of multiple Recovery Series appliances and SaaS applications.

Micro Focus leads with Data Protector, which mainly protects workloads in physical and virtual environments. This platform is offered in Express and Premium editions, which are designed for virtual environments and for virtual, physical, and cloud-integrated environments, respectively. Zerto, the remaining niche player, offers a single platform that combines backup and disaster recovery functionality, thereby simplifying data protection architectures. Additionally, the provider’s continuous data protection (CDP)-based architecture allows for very low recovery point objectives (RPOs).

The final two vendors in this report are the visionaries: Druva and Acronis. While Druva is new to this report, Acronis was also named a visionary in last year’s iteration. In 2020, Acronis launched several new security features on both its on-prem and Backup as a Service (BaaS) platforms to improve malware protection and provide safe recovery of backup copies. The Acronis Cyber Protect Cloud platform also provides a comprehensive BaaS offering. Druva’s new capabilities released in the last year include a ransomware recovery service, integrated backup and archiving for network-attached storage data, Azure Active Directory integration, and enhancements for Oracle and Microsoft SQL Server backup. Additionally, Druva acquired sfApex in 2020 to improve its Salesforce data protection capabilities.

Read Gartner’s Magic Quadrant for Enterprise Backup and Recovery Software Solutions.

