data storage requirements

Results 1 - 25 of 33
Published By: Infinidat EMEA     Published Date: May 14, 2019
Continuous data availability is a key business continuity requirement for storage systems. It ensures protection against downtime in case of serious incidents or disasters and enables recovery to an operational state within a reasonably short period. To ensure continuous availability, storage solutions need to meet resiliency, recovery, and contingency requirements outlined by the organization.
Tags : 
    
Infinidat EMEA
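Continuous-availability targets like the ones described above are usually expressed as "nines" of uptime. As a minimal sketch (the availability percentages below are common industry shorthand, not figures from the Infinidat paper), here is how a target translates into the downtime it actually permits per year:

```python
# Convert an availability target into the maximum downtime it permits per year.
# Illustrative arithmetic only; the percentages are generic examples
# ("two nines", "three nines", "five nines"), not vendor commitments.

MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600 minutes

def allowed_downtime_minutes(availability_pct: float) -> float:
    """Maximum minutes of downtime per year at a given availability %."""
    return MINUTES_PER_YEAR * (1 - availability_pct / 100)

for pct in (99.0, 99.9, 99.999):
    print(f"{pct}% availability -> {allowed_downtime_minutes(pct):.1f} min/year")
```

The steep drop from thousands of minutes at 99% to a handful at 99.999% is why resiliency, recovery, and contingency requirements have to be specified explicitly rather than assumed.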
Published By: Broadcast Beat Magazine     Published Date: Oct 30, 2019
After examining the TCO of object storage vs. LTO tape, and seeing the benefits achieved in a real-world case study, it’s easy to understand why object storage is quickly becoming the standard for data storage, backup, archive, and recovery. It’s clear that when all costs associated with data storage are tracked, object storage beats tape storage hands down. Even with costs aside, on-premises object storage with Cloudian provides businesses with the flexibility they need to respond to changing technology environments, storage requirements, data monetization opportunities, and more.
Tags : 
    
Broadcast Beat Magazine
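A TCO comparison of the kind described above combines up-front acquisition cost with recurring operating cost over the evaluation term. The sketch below shows the shape of that calculation; all cost figures are hypothetical placeholders, not numbers from the Cloudian case study:

```python
# Simplified multi-year TCO comparison of object storage vs. LTO tape.
# Cost inputs are invented for illustration only.

def tco(acquisition: float, annual_opex: float, years: int) -> float:
    """Total cost of ownership = up-front cost + recurring costs over the term."""
    return acquisition + annual_opex * years

# Hypothetical example: tape is cheaper to acquire but costlier to operate
# (media handling, retrieval, off-site logistics).
object_storage = tco(acquisition=100_000, annual_opex=20_000, years=5)
lto_tape = tco(acquisition=60_000, annual_opex=35_000, years=5)

print(f"Object storage 5-yr TCO: ${object_storage:,.0f}")
print(f"LTO tape 5-yr TCO:       ${lto_tape:,.0f}")
```

The point of "tracking all costs" is visible in the model: whichever line item dominates depends on the term length, which is why a one-year price comparison can invert at five years.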
Published By: Oracle     Published Date: Aug 09, 2018
The purpose of IT backup and recovery systems is to avoid data loss and recover quickly, thereby minimizing downtime costs. Traditional storage-centric data protection architectures, such as Purpose-Built Backup Appliances (PBBAs), and the conventional backup and restore processing supporting them, are prone to failure on recovery. This is because the processes, both automated and manual, are too numerous, too complex, and too difficult to test adequately. In turn, this leads to unacceptable levels of failure for today’s mission-critical applications, and a poor foundation for digital transformation initiatives. Governments are taking notice. Heightened regulatory compliance requirements have implications for data recovery processes and are an unwelcome but timely catalyst for companies to get their recovery houses in order. Malware such as ransomware, along with other cyber attacks, increases the imperative for organizations to have highly granular recovery mechanisms in place.
Tags : 
    
Oracle
Published By: Dell EMC     Published Date: Nov 10, 2015
No matter how advanced data centers may become, they remain in a perpetual state of change in order to meet the demands of virtualized environments. But with the advent of software-defined storage (SDS) architecture, capabilities associated with hyperconverged technologies (including compute, storage, and networking) help data centers meet virtualization requirements with less administrator intervention at web scale.
Tags : 
    
Dell EMC
Published By: EMC Corporation     Published Date: Jul 07, 2013
While the concept of big data is nothing new, the tools and technology are now in place for companies of all types and sizes to take full advantage. Enterprises in industries such as media, entertainment, and research and development have long been dealing with data in large volumes and unstructured formats - data that changes in near real time. However, extracting meaning from this data has been prohibitive, often requiring custom-built, expensive technology. Now, thanks to advancements in storage and analytics, all organizations can leverage big data to gain the insight needed to make their businesses more agile, innovative, and competitive.
Tags : 
big data, emc, technology, storage, analytics, data management, security, knowledge management, platforms, business technology
    
EMC Corporation
Published By: EMC Corporation     Published Date: Feb 13, 2015
XtremIO all-flash arrays (AFAs) have redefined everything you know about SQL Server database infrastructures. Through a ground-breaking, fresh approach to storage design, XtremIO is uniquely engineered for SQL Server database requirements, utilizing a powerful and vastly simplified scale-out performance architecture with in-memory, always-on compression, deduplication, and space-efficient copy services that enable application acceleration, consolidation, and agility.
Tags : 
all-flash-arrays, database infrastructures, sql server, database requirements, application acceleration, data management
    
EMC Corporation
Published By: IBM APAC     Published Date: Jul 19, 2019
It’s important to understand the impact of AI workloads on data management and storage infrastructure. If you’re selecting infrastructure for AI workloads involving ML and deep learning, you must understand the unique requirements of these emerging workloads, especially if you’re looking to use them to accelerate innovation and agility. This Gartner report highlights three main impacts that AI workloads have on data management and storage.
Tags : 
    
IBM APAC
Published By: NetApp     Published Date: Sep 18, 2014
The NetApp flash portfolio is capable of solving database performance and I/O latency problems encountered by many database deployments. The majority of databases have a random I/O workload that creates performance problems for spinning media, but is well-suited for today’s flash technologies. NetApp has a diverse enterprise-class flash portfolio consisting of flash in the storage controller (Flash Cache™ intelligent caching), flash within the disk shelves (Flash Pool™ intelligent caching), and all-flash arrays (EF-Series and All-flash FAS). This portfolio can be used to solve complex database performance requirements at multiple levels within a customer’s Oracle environment. This document reviews Oracle database observations and results when implementing flash technologies offered within the NetApp flash portfolio.
Tags : 
database performance, database deployment, flash technology, enterprise technology
    
NetApp
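The claim above that random I/O workloads suit flash better than spinning media comes down to per-operation latency: a disk must seek and wait for rotation on every random read, while flash has no moving parts. A back-of-envelope sketch (the latency figures are typical published ballpark values, not NetApp measurements):

```python
# Approximate single-device IOPS for a random 4 KB read, illustrating why
# random I/O favors flash over spinning disk. Latencies are generic
# ballpark figures, not measurements from the NetApp paper.

def iops(latency_ms: float) -> float:
    """Approximate IOPS a single device sustains at the given per-op latency."""
    return 1000.0 / latency_ms

hdd_latency = 4.0 + 2.0 + 0.1   # avg seek + half-rotation (7.2k RPM) + transfer, ms
ssd_latency = 0.1               # typical flash read latency, ms

print(f"HDD random-read IOPS: ~{iops(hdd_latency):.0f}")
print(f"SSD random-read IOPS: ~{iops(ssd_latency):.0f}")
```

Roughly two orders of magnitude separate the two for random reads, which is why caching tiers (Flash Cache, Flash Pool) or all-flash arrays remove the bottleneck that random database I/O creates on disk.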
Published By: IBM APAC     Published Date: Mar 19, 2018
Unstructured data has exploded in volume over the past decade. Unstructured data, media files and other data can be created just about anywhere on the planet using almost any smart device available today. As the amount of unstructured data grows exponentially, customers using this data need to be able to take advantage of the right storage solutions to support all of their file and object data requirements. IBM® recently added a new storage system to their Spectrum product family, IBM Spectrum Network Attached Storage (NAS). IBM Spectrum NAS adds another software-defined file storage system to IBM’s current unstructured data storage solutions, IBM Spectrum Scale™ and IBM Cloud Object Storage (COS). Below, we will discuss the three systems and supply some guidance on when and where to use each of them.
Tags : 
    
IBM APAC
Published By: NetApp     Published Date: Dec 10, 2014
IT leaders are required to deliver improved performance and increase efficiencies while maintaining an enterprise-class infrastructure for their customer’s database environments. A complete portfolio consisting of flash in the storage controller, flash in disk shelves, and all-flash arrays is the best way to solve complex database performance requirements at multiple levels in your environment. The NetApp flash portfolio is capable of solving database performance and I/O latency problems encountered by many database deployments.
Tags : 
it leaders, flash, database environments, netapp, infrastructure, it management
    
NetApp
Published By: HP     Published Date: Nov 05, 2014
IT has never been more important to doing business, which means that IT infrastructure must be simpler, smarter, faster, more flexible, and more business-aligned than ever. New service delivery models are driving new Tier-1 storage requirements that reveal that the traditional focus on performance and availability is clearly not enough to support virtualization, ITaaS, and new cloud service delivery models. The world is moving rapidly towards a New Style of IT, and will leave behind any business that doesn’t adapt even more rapidly. Is your storage ready? Download this whitepaper now.
Tags : 
data center, change management, data management, storage space, storage criteria, storeserv, tier-1, market report, it requirements, asset utilization, data mobility, maintenance upgrades, it management, enterprise applications
    
HP
Published By: Lenovo and Intel®     Published Date: Oct 31, 2018
Digital transformation is putting data at the core of business processes, necessitating a rethink of IT infrastructure. This new business environment is driving new requirements for performance, scalability, availability, and flexibility in data center storage. In this IDC white paper, learn about evolving enterprise infrastructure requirements and learn more about the Lenovo and NetApp strategic relationship, including IDC's take on the value this delivers to both customers and the vendors themselves.
Tags : 
    
Lenovo and Intel®
Published By: Lenovo     Published Date: Nov 01, 2018
Digital transformation is putting data at the core of business processes, necessitating a rethink of IT infrastructure. This new business environment is driving new requirements for performance, scalability, availability, and flexibility in data center storage. In this IDC white paper, learn about evolving enterprise infrastructure requirements and learn more about the Lenovo and NetApp strategic relationship, including IDC's take on the value this delivers to both customers and the vendors themselves.
Tags : 
    
Lenovo
Published By: Lenovo and Intel®     Published Date: Oct 14, 2016
Everything you need to know about Infrastructure for Desktop Virtualization—in one eBook. Dive into this extensive eBook to get all the details you need to consider when launching down the path of virtualization. In this eBook, from Brian Suhr, author of the blogs Data Center Zombie and VirtualizeTips, and editor Sachin Chheda, director of solutions and verticals marketing at Nutanix, we provide detailed analysis and key points to consider, including:
• Architectural Principles
• Building Blocks
• Infrastructure Alternatives
• Storage Requirements
• Compute Sizing
Get the eBook
Tags : 
infrastructure, capacity, monitoring, storage requirements, compute sizing, scalability, business technology
    
Lenovo and Intel®
Published By: Trillium Software     Published Date: May 19, 2011
By implementing the six pillars of data quality optimization, your organization can incrementally improve the quality of the data that drives all your operations.
Tags : 
trillium software, data quality management, iqm, information quality management, quality maturity model, enterprise data management, edm, data storage requirements, digital universe, data hygiene best practices, data standards, data monitoring
    
Trillium Software
Published By: NexGen     Published Date: Feb 09, 2015
The nature of the financial services industry places a myriad of international compliance requirements on a company's IT team, as well as an expectation by its customers to deliver the highest levels of performance and reliability. To survive and thrive, businesses in the industry must not only keep pace with customer demand but also gain competitive advantage. Those demands mean the IT team must be at the forefront of adopting emerging technologies. This is certainly true for Orangefield Columbus, which recently experienced significant growth in its multiple databases that led to the serious performance degradation of its existing storage system. By focusing on a proactive data management storage array, Orangefield was able to eliminate resource contention. Download now and examine Orangefield's journey to find a solution that would meet, and exceed, their performance and capacity requirements.
Tags : 
nexgen, vmware, citrix, flash, financial services, it management
    
NexGen
Published By: IBM Corp     Published Date: Sep 23, 2011
The ever-increasing pace of data growth calls for a comprehensive storage strategy that protects data infrastructure from performance problems and is also flexible enough to handle changing business requirements. Read the advisory brief from Frost & Sullivan's Stratecast group and learn how IBM's suite of storage solutions addresses big data challenges. You'll also learn how you can introduce scalability and efficiency to your storage environment - avoiding the storage Armageddon.
Tags : 
technology, storage, esg, capacity, ibm
    
IBM Corp
Published By: IBM     Published Date: Jul 25, 2012
As has been the trend over the last decade, organizations must continue to deal with growing data storage requirements using the same or fewer resources. The growing adoption of storage-as-a-service, business intelligence, and big data results in ever more Service Level Agreements that are difficult to fulfill without IT administrators spending ever longer hours in the data center. Many organizations now expect their capital expense growth for storage to be unstoppable, and see operating expense levers - such as purchasing storage systems that are easy to manage - as the only way to control data storage-related costs.
Tags : 
infrastructure, technology, cloud, storage, virtualization, data management, business technology
    
IBM
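The "unstoppable" capital-expense growth described above is a compounding problem: at a steady annual growth rate, purchased capacity runs out on a predictable schedule. A minimal sketch of that projection, with wholly hypothetical capacities and growth rate:

```python
# Years until storage capacity is exhausted under compound data growth.
# The 100 TB / 500 TB / 40% figures are invented for illustration.
import math

def years_until_full(current_tb: float, capacity_tb: float,
                     annual_growth: float) -> float:
    """Solve current * (1 + g)^t = capacity for t."""
    return math.log(capacity_tb / current_tb) / math.log(1 + annual_growth)

# 100 TB today, 500 TB purchased capacity, 40% annual growth:
t = years_until_full(100, 500, 0.40)
print(f"Capacity exhausted in ~{t:.1f} years")
```

Even a 5x capacity headroom lasts under five years at 40% growth, which is why the abstract frames easy-to-manage systems (an opex lever) as the controllable side of the cost equation.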
Published By: Box     Published Date: Jan 16, 2015
The nature of the financial services industry places a myriad of international compliance requirements on a company's IT team, as well as an expectation by its customers to deliver the highest levels of performance and reliability. To survive and thrive, businesses in the industry must not only keep pace with customer demand but also gain competitive advantage. Those demands mean the IT team must be at the forefront of adopting emerging technologies. This is certainly true for Orangefield Columbus, which recently experienced significant growth in its multiple databases that led to the serious performance degradation of its existing storage system. By focusing on a proactive data management storage array, Orangefield was able to eliminate resource contention. Download now and examine Orangefield's journey to find a solution that would meet, and exceed, their performance and capacity requirements.
Tags : 
nexgen, vmware, citrix, flash, financial services, it management
    
Box
Published By: SnapLogic     Published Date: Aug 17, 2015
This report summarizes the changes that are occurring, the new and emerging patterns of data integration, and the data integration technology you can buy today that lives up to these new expectations.
Tags : 
data integration, cloud computing, mass data storage, integration requirements, integration strategies, non-persisted data streaming, device native data, data encryption
    
SnapLogic
Published By: IBM     Published Date: Feb 18, 2009
This white paper explains how the seven basic principles for managing enterprise application data can help your organization: Establish effective policies for full-lifecycle enterprise data management to control data growth and lower storage costs; Meet service level goals to achieve the timely completion of key business processes for mission-critical applications; Support data retention compliance initiatives and mitigate risk for audits and e-discovery requests; Implement scalable archiving strategies that easily adapt to your ongoing business requirements.
Tags : 
ibm integrated data management, siebel crm, archiving project, service level goals, data retention compliance, archiving strategies, ibm optim data growth solution, rapid data growth, data classification, enterprise applications, data management
    
IBM
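A full-lifecycle policy of the kind the principles above describe typically classifies data by age or last access and moves it between tiers. The sketch below illustrates the general technique only; the tier names and thresholds are assumptions for illustration, not IBM Optim policy terms:

```python
# Age-based data-lifecycle classification: assign each record to a storage
# tier by how long it has gone unused. Tiers and cut-offs are hypothetical.
from datetime import date

def classify(last_accessed: date, today: date) -> str:
    """Assign a record to a storage tier based on its idle age in days."""
    age = (today - last_accessed).days
    if age <= 90:
        return "active"        # keep on primary storage
    elif age <= 7 * 365:
        return "archive"       # move to cheaper archival storage
    else:
        return "purge-review"  # candidate for defensible deletion

print(classify(date(2024, 1, 1), date(2024, 2, 1)))  # recently used -> "active"
```

Rules like these are what let archiving control data growth (the first principle) while still honoring retention windows for compliance and e-discovery (the third).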
Published By: IBM     Published Date: Feb 18, 2009
This white paper explains how the seven basic principles for managing enterprise application data can help your organization: Establish effective policies for full-lifecycle enterprise data management to control data growth and lower storage costs; Meet service level goals to achieve the timely completion of key business processes for mission-critical applications; Support data retention compliance initiatives and mitigate risk for audits and e-discovery requests; Implement scalable archiving strategies that easily adapt to your ongoing business requirements.
Tags : 
ibm integrated data management, oracle e-business suite, business-critical erp applications, archiving project, service level goals, data retention compliance, archiving strategies, ibm optim data growth solution, rapid data growth, data classification, enterprise applications, data management
    
IBM
Published By: IBM     Published Date: Feb 18, 2009
This white paper explains how the seven basic principles for managing enterprise application data can help your organization: Establish effective policies for full-lifecycle enterprise data management to control data growth and lower storage costs; Meet service level goals to achieve the timely completion of key business processes for mission-critical applications; Support data retention compliance initiatives and mitigate risk for audits and e-discovery requests; Implement scalable archiving strategies that easily adapt to your ongoing business requirements.
Tags : 
ibm integrated data management, jd edwards, business-critical erp applications, archiving project, service level goals, data retention compliance, archiving strategies, ibm optim data growth solution, rapid data growth, data classification, enterprise applications, data management
    
IBM
Published By: Pillar Data Systems     Published Date: Apr 20, 2010
In this brief 23-minute on-demand Webinar, opinion leaders from Pillar Data Systems and industry experts from featured analyst firm, Gartner, Inc., break down the challenges that today's organizations face and help them to select a flexible storage platform that will adapt to changing business and application requirements while minimizing risks and reducing management complexity.
Tags : 
pillar data systems, modern storage infrastructure, it infrastructure, productivity, tco, green it, data
    
Pillar Data Systems
Published By: Arcserve     Published Date: Feb 26, 2015
In typical organizations, most data resides outside the data center, so it is important that the protection of desktop and laptop computers is given the same priority as file servers and application servers. Have you deployed the right data protection strategy for endpoints? We’re here to help! Arcserve UDP offers a FREE Workstation edition product that specifically focuses on backing up data on endpoints. Not only can desktops and laptops be protected for FREE with award-winning technology that minimizes bandwidth and storage requirements, but they can participate in the global deduplication schema offered by UDP (for 30 days), and have their data protected in public and private clouds, and more! This is too good to pass up! Get your FREE Arcserve UDP Workstation edition now.
Tags : 
arcserve, unified data protection, software download, bandwidth, deduplication, it management, knowledge management, enterprise applications
    
Arcserve
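Deduplication like that mentioned above works by storing each unique data block once and keeping a list of block references per file. A minimal fixed-size-block sketch of the general technique (this illustrates the concept only, not Arcserve UDP's actual on-disk format):

```python
# Fixed-size block deduplication sketch: store each unique block once,
# keyed by its SHA-256 hash, plus an ordered "recipe" to rebuild the data.
import hashlib

BLOCK_SIZE = 4096

def dedup(data: bytes):
    """Split data into blocks; return (unique block store, ordered hash list)."""
    store, recipe = {}, []
    for i in range(0, len(data), BLOCK_SIZE):
        block = data[i:i + BLOCK_SIZE]
        digest = hashlib.sha256(block).hexdigest()
        store.setdefault(digest, block)   # keep only the first copy
        recipe.append(digest)
    return store, recipe

def restore(store, recipe) -> bytes:
    """Rebuild the original byte stream from the recipe."""
    return b"".join(store[d] for d in recipe)

data = b"A" * 8192 + b"B" * 4096 + b"A" * 4096   # repeated blocks
store, recipe = dedup(data)
print(f"{len(recipe)} blocks, {len(store)} unique")  # 4 blocks, 2 unique
assert restore(store, recipe) == data
```

Because identical blocks across many endpoints hash to the same digest, a global scheme stores them once for the whole fleet, which is how bandwidth and storage requirements shrink as more workstations participate.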