MEDIA 7: You have had an expansive career in diverse fields ranging from operations management, business development, sales and marketing, to process improvements. How has all this experience helped in your current role at Panzura?
GLEN SHOK: I’ve been in the enterprise and software sectors for more than two decades, working at companies like Oracle, EMC, and Cisco to build industry partnerships and define market strategies for software-defined storage, data networking, data centers, and of course the cloud. I’ve seen a lot of change, but when it comes to using and consuming data, the enterprise has always been moving towards less cost, more availability, and more agility, all with less risk.
For my role at Panzura, this experience has given me a perspective on how business has accelerated towards the cloud. It’s about delivering on-demand IT services that are elastic and able to scale up and down when needed. I know that the cloud can do a lot of things, but without the driving forces of data management and data security, none of that is possible. As I see it, success will come from helping the enterprise simultaneously address both the IT imperative of hybrid-cloud, and the business imperative of multi-cloud, and doing it in a way that keeps data safe.
Panzura makes data available so it can be used securely, and at a much lower cost. We also increase the value of that data by enabling real-time collaboration, integrating data from many different silos, and making it easy to analyze and manage, all on a shared common platform. Getting back to the main drivers of the move to the cloud–elasticity, simplicity and security–we’re seeing that any interface for cloud storage needs to support these things for a multitude of diverse business cases. That includes banking and financial markets, healthcare and hospitals, construction, engineering, manufacturing–the list goes on.
The natural evolution of that is taking shape with Panzura, where organizations with people all around the world will be able to utilize any kind of storage, and the data placed in it, as if it were a single data center. As part of that, they’ll also be able to discover the capabilities of the cloud and use this interface to manage them, too. They’ll do all of this with complete confidence that their data is protected, secure, and resilient to any type of failure. It’s that evolution and innovation that has me most excited, and it’s why Panzura is a natural fit for my experience.
M7: How does Panzura’s immutable data architecture offer a solution to ransomware attacks?
GS: Great question! I mentioned the fact that the enterprise has always been moving toward less risk, and the cloud has made that far more complicated. Legacy file systems were already inherently vulnerable to ransomware and other types of malware because the data held in them needs to be editable. When attacked, they do exactly what they are designed to do, which is to allow files to be changed.
Ransomware, once inside the network, can gain access to these systems and change or encrypt data, making it inaccessible. Everything grinds to a halt until a ransom is paid. When you get hit, recovering “clean” files after an attack is exceptionally difficult and time-consuming. Traditional backup processes tend to run on a scheduled basis, so there is almost always a time gap that results in data loss, and restoring from a backup is time- and labor-intensive.
The Panzura CloudFS global file system is built in a way that makes it impossible for attackers to alter or change data, so the data itself is resistant to attack. Instead of fighting off intruders at the castle wall, it reduces the impact and recovery time after an attack by keeping data unaffected.
M7: How does Panzura CloudFS improve data security and backup processes?
GS: From the attacker’s point of view, the success or failure of an attack depends on your ability to restore access to your data, unless the ransom is paid. That’s why they often go after backups first, to limit your recovery options. This leaves you with nothing but offsite backups to restore your data, and as I mentioned, that is a very slow and costly process. While all of this is happening, users are locked out of their files. Each hour that goes by is lost time and money. But our approach to data immutability, and the ability of our hybrid-cloud solution to encrypt data and make it completely useless to attackers, means Panzura CloudFS users don’t worry about downtime and never pay ransoms.
Let me explain! Panzura CloudFS stores file data as blocks in cloud object storage, as a single authoritative data set that every user in the organization works from. It makes no difference how many people, or how far apart they are. Every user gets what feels like a local file experience, even though the data itself is stored hundreds, if not thousands of miles away. Those data blocks are immutable, stored in a ‘Write Once, Read Many’ form, so that once stored, they cannot be changed, edited, or overwritten. That makes them impervious to all forms of malware.
Metadata pointers are used to record which blocks comprise a file at any given time. As users create or edit files, changed data chunks are moved to object storage every 60 seconds, and are stored as new data blocks. At the same time, those pointers are updated to reflect any new blocks that form the file. For example, if a file is composed of blocks A, B, C and D, and it is edited today, it might now be composed of blocks A, B, C and E. The new block E is moved to the object store, and the pointers record that A, B, C and E are required to open the current version of that file.
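The A, B, C, D example above can be sketched as a content-addressed, write-once block store with a separate pointer layer. This is an illustrative simplification only–the class names, hashing scheme, and chunking here are my own stand-ins, not Panzura's actual on-disk format:

```python
import hashlib


class ImmutableBlockStore:
    """Write-once block store: blocks are keyed by content hash and never mutated."""

    def __init__(self):
        self._blocks = {}  # hash -> bytes

    def put(self, data: bytes) -> str:
        key = hashlib.sha256(data).hexdigest()
        self._blocks.setdefault(key, data)  # an existing block is never overwritten
        return key

    def get(self, key: str) -> bytes:
        return self._blocks[key]


class FilePointers:
    """Metadata layer: records which block hashes compose the current file version."""

    def __init__(self, store: ImmutableBlockStore):
        self.store = store
        self.files = {}  # file name -> ordered list of block hashes

    def write(self, name: str, chunks: list) -> None:
        # An edit stores changed chunks as brand-new blocks and swaps the
        # pointers; previously written blocks are never touched.
        self.files[name] = [self.store.put(c) for c in chunks]

    def read(self, name: str) -> bytes:
        return b"".join(self.store.get(h) for h in self.files[name])


store = ImmutableBlockStore()
fs = FilePointers(store)
fs.write("report.txt", [b"A", b"B", b"C", b"D"])
v1 = list(fs.files["report.txt"])
fs.write("report.txt", [b"A", b"B", b"C", b"E"])  # edit: block D replaced by new block E
```

After the edit, blocks A, B, and C are reused, the new block E is added, and the original block D still sits untouched in the store–only the pointers changed.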
These immutable data blocks are further protected by system-wide read-only snapshots. These are essentially exact replicas of the data and are taken at configurable intervals–typically no more than every 60 minutes. They keep files consistent while they’re being worked on. For backup purposes, additional read-only snapshots are taken every 60 seconds, and these are used to transfer changed data to the object store. As these are read-only snapshots, they are impervious to ransomware, and they provide a way to restore data to any previous version in a very precise way.
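A rough sketch of why restoring from these snapshots can be so fast and precise: because the data blocks themselves never change, a snapshot only has to freeze the metadata pointer table, and a restore is a pointer swap rather than a bulk data copy. This is an illustrative model with invented names, not Panzura's code:

```python
import copy


class SnapshotCatalog:
    """Read-only snapshots of the file system's metadata pointers.

    Since blocks are immutable, freezing the pointer table is enough to
    preserve an entire point-in-time view of the file system.
    """

    def __init__(self):
        self._snaps = []  # (label, frozen pointer table), oldest first

    def take(self, label: str, pointers: dict) -> None:
        self._snaps.append((label, copy.deepcopy(pointers)))

    def restore(self, label: str) -> dict:
        # Return a fresh copy of the frozen table; the snapshot itself
        # stays read-only and is never handed out for mutation.
        for snap_label, frozen in reversed(self._snaps):
            if snap_label == label:
                return copy.deepcopy(frozen)
        raise KeyError(label)


catalog = SnapshotCatalog()
pointers = {"report.txt": ["h_A", "h_B", "h_C", "h_D"]}
catalog.take("10:00", pointers)
pointers["report.txt"] = ["h_ENCRYPTED"]  # ransomware rewrites the live pointers...
clean = catalog.restore("10:00")          # ...but the frozen snapshot is untouched
```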
M7: How does Panzura enable quick progress towards realistic digital transformations for organizations using AI and data analytics?
GS: Panzura CloudFS has a built-in SaaS-based data analytics layer called Panzura Data Services. It allows users to apply cloud-based AI and ML analytics across many types of unstructured data. We partner with the leading cloud and managed service providers to deliver optimized, fit-for-purpose solutions that make it faster and more efficient to handle massive volumes of data files. These files can be stored, retrieved, searched, and analyzed using AI in compliance with the specific regulatory requirements of financial institutions, healthcare, and other key sectors.
Panzura allows organizations to migrate or re-platform data, workloads, and applications to the cloud, and to consolidate data across multiple on-premises servers and the cloud, without having to refresh existing infrastructure. Applications and data can be moved as-is to a public cloud, while enhancing or replacing some components to take advantage of cloud services–which are themselves becoming more AI-driven–without rewrites or workflow changes.
As I mentioned before, the Panzura global file system is built on a unique immutable architecture that delivers the highest level of data protection and recovery of any solution in its category. Granular recovery capabilities are augmented by the powerful analytics of Panzura Data Services–which is also becoming more AI-driven–to identify and restore files near-instantaneously in the event of data loss, damage, or ransomware attack.
M7: What makes Panzura’s cloud data management platform stand out among its competitors?
GS: While there are similar solutions out there, they are better suited to small implementations across just a few sites. None achieve real-time global file consistency, or anything close to it. Those solutions typically sync data to central storage, and local filers then pull data from that storage once it is available. The time to achieve file consistency therefore depends on how long each location takes to sync changes to the cloud store, and on how long the querying location takes to retrieve them. While lower-level solutions are often good enough for legacy storage, they still leave users waiting on data. This significantly impacts productivity and produces data bloat, with multiple redundant but still out-of-sync file versions that make data difficult to manage.
Panzura moves data in real-time, as it is created and when it is required. All locations in the Panzura global file system sync changed data to the cloud simultaneously, every 60 seconds. Should another location need to open a file before this occurs, a peer-to-peer connection handles the change in file ownership, as well as any changed data blocks which make the file consistent. This happens in milliseconds in the background.
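That flow can be modeled roughly as follows. The `Location`, `sync_cycle`, and `open_file` names are invented for illustration, and the real protocol involves distributed locking and block-level deltas rather than whole files:

```python
class Location:
    """One site in the global file system."""

    def __init__(self, name: str):
        self.name = name
        self.cache = {}    # file name -> content
        self.dirty = set() # files changed locally since the last cloud sync

    def edit(self, fname: str, content: bytes) -> None:
        self.cache[fname] = content
        self.dirty.add(fname)


def sync_cycle(cloud: dict, locations: list) -> None:
    """The 60-second cycle: every site uploads its changed data to the cloud."""
    for loc in locations:
        for fname in loc.dirty:
            cloud[fname] = loc.cache[fname]
        loc.dirty.clear()


def open_file(fname: str, requester: Location, owner: Location, cloud: dict) -> bytes:
    """If the current owner has unsynced changes, hand the changed data (and
    file ownership) over peer-to-peer instead of waiting for the next sync."""
    if fname in owner.dirty:
        requester.cache[fname] = owner.cache[fname]
        requester.dirty.add(fname)   # ownership moves to the requester
        owner.dirty.discard(fname)
    else:
        requester.cache[fname] = cloud[fname]
    return requester.cache[fname]
```

In this toy model, a second site opening a just-edited file gets the current bytes immediately via the peer path, even though the next cloud sync hasn't run yet.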
No other solution provides built-in analytics for integrated search, audit, and file network analysis over an entire cloud storage infrastructure. Some even require indexing to be switched off, making it painfully slow to find files. This also means they cannot offer AI- and ML-based performance and activity alerts, and they generally don’t have tools for admins to fully diagnose and troubleshoot their environment.
M7: What do you consider to be the best practices for streamlining data infrastructure?
GS: We come at this from a couple of angles. Panzura’s hybrid-cloud approach makes files immediately consistent across sites and provides enterprise-grade durability without replicating files for backup and disaster recovery. Legacy approaches house user files and replicate them to a secondary site, so users from both sites have access to the same files. In this scenario, the company investment is already twice the original storage investment to satisfy the RPO and remote collaboration requirements.
Instead of replicating files across locations, Panzura uses public, private or dark cloud storage as a single authoritative data source. Virtual machines at the edge, on-premises, or in cloud regions, overcome latency by holding the file system’s metadata as well as intelligently caching the most frequently used files to achieve local-feeling performance.
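The caching behavior described here resembles a classic least-recently-used cache sitting in front of slower object storage. This toy version (class and parameter names are my own, not Panzura's) shows why hot files feel local while cold ones cost one cloud round trip:

```python
from collections import OrderedDict


class EdgeCache:
    """Edge node: holds the file system's metadata in full, but caches only
    the hottest files; everything else is fetched from object storage."""

    def __init__(self, fetch_from_cloud, capacity: int = 2):
        self.fetch = fetch_from_cloud  # callable: key -> bytes (one cloud round trip)
        self.capacity = capacity
        self._cache = OrderedDict()

    def read(self, key: str) -> bytes:
        if key in self._cache:
            self._cache.move_to_end(key)        # hit: local-feeling latency
            return self._cache[key]
        data = self.fetch(key)                  # miss: go to the object store
        self._cache[key] = data
        if len(self._cache) > self.capacity:
            self._cache.popitem(last=False)     # evict the least recently used file
        return data
```

A real implementation would cache at block granularity and weight eviction by usage patterns, but the effect is the same: repeated reads of the same data never leave the site.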
Much has been written about the exponential growth of unstructured data. Much less has been said about how legacy approaches to file systems and data management contribute to that increase.
Cloud providers automatically create redundant copies of data across different repositories and locations by building durability into their service. When you consider that unstructured data is already growing at an incredible pace, adding more storage to compensate is not only complicated and costly but doesn’t solve the real problem. Under the strain of this load, most organizations quickly find that their existing storage systems are cumbersome at best. Traditional network-attached storage–or NAS–is plagued by insufficient workflows and performance, not to mention security risks.
That inefficiency and inflexibility create delays in file access that make it nearly impossible to work productively, and the problem is even worse when multiple people in different locations collaborate on files. Along with delays, people may end up editing files that are already being edited by another employee. These legacy NAS systems may have virtually no backend tech managing file edits, leaving duplicate files to run rampant.
As a result, each site shares its mess of siloed, redundant data with the wider enterprise. The spread of mismanaged data amounts to a stockpile of badly maintained clutter that takes a lot of work to navigate, consolidate, and tolerate.
Panzura CloudFS uses a single, authoritative data set as the “golden copy” of each file. Dynamic caching based on usage patterns keeps duplicates at bay. What’s more, it uses file sync methods alongside intelligent and conservative file locking, to keep the storage space tidy and duplicate-free.
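The file-locking idea can be illustrated with a minimal global lock table. This is a stand-in sketch, not Panzura's actual locking protocol, which per the earlier description also handles peer-to-peer ownership transfer:

```python
import threading


class FileLockTable:
    """A global write lock per file: only one site can edit a file at a time,
    so there is never a conflicting duplicate copy to reconcile later."""

    def __init__(self):
        self._guard = threading.Lock()
        self._owners = {}  # file name -> site currently holding the write lock

    def acquire(self, fname: str, site: str) -> bool:
        with self._guard:
            # First requester becomes the owner; re-acquiring your own lock succeeds.
            holder = self._owners.setdefault(fname, site)
            return holder == site  # False means another site is editing this file

    def release(self, fname: str, site: str) -> None:
        with self._guard:
            if self._owners.get(fname) == site:
                del self._owners[fname]
```

A second site that fails to acquire the lock would open the file read-only until the editor releases it, which is what keeps the "golden copy" free of duplicate, diverging versions.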
M7: Panzura CloudFS is set to replace legacy solutions. How do you see this segment growing as the threat from ransomware increases?
GS: Well, I think the threat of ransomware is going to continue to grow alongside other novel malware tactics like data wiper exploits. We’re already seeing this unfold in very disturbing ways with state-sponsored cyber-crime now emerging as a threat that could easily dwarf rogue criminals out to make a profit from holding data ransom. In many ways, I see this as the final nail in the coffin for legacy solutions. The need for more and better data management solutions to contain these threats has shifted the landscape, which was already moving toward the cloud as we discussed.
For one thing, we’re seeing demand for self-service ransomware recovery capabilities that make it easier for organizations to take data recovery and restoration into their own hands when an attack happens. One way we’re addressing this is to roll out self-managed snapshot recovery that lets IT teams revert files or directories to a pre-ransomware state without the need for external support. If we can trim downtime by shaving off minutes with these types of capabilities, there’s the potential to save entire industries billions of dollars in lost time due to delayed work, and millions of IT man-hours.
M7: A few years down the line, how do you see AI changing the world around us?
GS: Looking into my crystal ball, in a few years I’m certain we’ll begin to see self-learning AI algorithms that, when unleashed on an organization’s data network, will quickly learn data usage patterns and begin shifting cloud capacity around automatically to make the entire network move faster. We’ll also see predictive AI serving up data and files to the right people, when and where they’re needed, anticipating the workflows of an organization in real-time.
But I think of this in terms of how ransomware and cyberwarfare are compressing timelines and bringing innovations that would have otherwise taken years to happen. Ransomware encrypts files so quickly that entire file systems can be compromised before organizations are even aware of a problem. Solving this problem is an area where Panzura is breaking new ground right now. We’re getting ready to introduce new AI-powered features that provide early detection and confirmation of ransomware attacks, and then send notifications via text and email.
That way, administrators will be alerted to ransomware attacks in near real-time so they can organize a quick response, cutting down on downtime and disruption, and even reducing recovery efforts. So, in many ways, the future is now, and that’s very exciting. But what comes next, for example, is self-healing cloud data management systems with AI at the helm, and the ability to connect entire industries into living data ecosystems–that’s even more incredible.
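On the detection piece, one widely used, vendor-neutral heuristic is worth illustrating: encrypted data has near-maximal byte entropy, so a sudden burst of high-entropy writes across many files is a strong ransomware signal. The threshold below is an assumption, and this is not a description of Panzura's actual detection logic:

```python
import math
from collections import Counter


def shannon_entropy(data: bytes) -> float:
    """Entropy in bits per byte; encrypted or compressed data approaches 8.0."""
    if not data:
        return 0.0
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in Counter(data).values())


def looks_encrypted(data: bytes, threshold: float = 7.5) -> bool:
    # Plain text and most office documents sit well below the threshold;
    # ciphertext is statistically close to uniform random bytes.
    return shannon_entropy(data) > threshold
```

A detector built on this would alert when many files cross the threshold in a short window, rather than on any single write, to keep false positives from compressed media files manageable.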