data capacity

Results 1 - 25 of 183
Published By: Spectrum Enterprise     Published Date: Oct 29, 2018
Bandwidth. Speed. Throughput. These terms are not interchangeable. They are interrelated concepts in data networking that measure, respectively, capacity, the time it takes data to get from one point to another, and the actual amount of data you receive. When you buy an Internet connection from Spectrum Enterprise, you’re buying a pipe between your office and the Internet with a set capacity, whether it is 25 Mbps, 10 Gbps, or any increment in between. However, the bandwidth we provide does not tell the whole story; it is the throughput of the entire system that matters. Throughput is affected by obstacles, overhead and latency, meaning the throughput of the system will never equal the bandwidth of your Internet connection. The good news is that an Internet connection from Spectrum Enterprise is engineered to ensure you receive the capacity you purchase; we proactively monitor your bandwidth to ensure problems are dealt with promptly, and we act as your advocates across the Internet.
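To make the distinction concrete, here is a minimal sketch of one common reason throughput falls short of bandwidth: a single TCP flow carries at most one receive window per round trip, so latency alone can cap throughput well below the purchased line rate. All figures are hypothetical examples, not Spectrum Enterprise specifications.

```python
# Minimal sketch: why measured throughput rarely equals purchased bandwidth.
# A single TCP flow moves at most one receive window per round trip, so its
# throughput is capped at min(link rate, window / RTT), before any protocol
# overhead is counted. Figures are hypothetical, not Spectrum numbers.

def max_tcp_throughput_mbps(link_mbps: float, window_bytes: int, rtt_ms: float) -> float:
    """Upper bound on single-flow TCP throughput, in Mbps."""
    window_limited_mbps = (window_bytes * 8) / (rtt_ms / 1000.0) / 1_000_000
    return min(link_mbps, window_limited_mbps)

# A 64 KiB window over a 25 ms round trip caps one flow near 21 Mbps,
# even on a 1 Gbps pipe.
print(max_tcp_throughput_mbps(link_mbps=1000, window_bytes=64 * 1024, rtt_ms=25))
```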
    
Spectrum Enterprise
Published By: VMware     Published Date: Dec 10, 2018
As agencies continue to modernize data center infrastructure to meet evolving mission needs and technologies, they are turning to agile software and cloud solutions. One such solution is hyper-converged infrastructure (HCI), a melding of virtual compute, storage, and networking capabilities supported by commodity hardware. With data and applications growing exponentially along with the need for more storage capacity and flexibility, HCI helps offset the rising demands placed on government IT infrastructure. HCI also provides a foundation for hybrid cloud, helping agencies permanently move applications and workloads into public cloud and away from the data center.
    
VMware
Published By: Upsite Technologies     Published Date: Sep 18, 2013
The average computer room today has cooling capacity that is nearly four times the IT heat load. Using data from 45 sites reviewed by Upsite Technologies, this white paper will show how you can calculate, benchmark, interpret, and benefit from a simple and practical metric called the Cooling Capacity Factor (CCF). Calculating the CCF is the quickest and easiest way to determine cooling infrastructure utilization and the potential gains to be realized from airflow management (AFM) improvements.
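The CCF arithmetic itself is simple. The sketch below assumes the formulation Upsite publishes, in which running rated cooling capacity is divided by the IT critical load plus a roughly 10% uplift for non-IT heat sources (lights, people, building envelope); the figures are hypothetical.

```python
# A minimal sketch of the Cooling Capacity Factor calculation, assuming
# the formulation Upsite publishes: CCF = running rated cooling capacity
# divided by 110% of the IT critical load (the 10% uplift approximates
# non-IT heat sources). All figures below are hypothetical.

def cooling_capacity_factor(running_cooling_kw: float, it_load_kw: float,
                            non_it_uplift: float = 0.10) -> float:
    """CCF = cooling capacity / (IT load * (1 + uplift))."""
    return running_cooling_kw / (it_load_kw * (1.0 + non_it_uplift))

# A room running 400 kW of rated cooling against a 110 kW IT load:
ccf = cooling_capacity_factor(running_cooling_kw=400, it_load_kw=110)
print(f"CCF = {ccf:.1f}")  # ~3.3: far more cooling running than the heat load requires
```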
Tags : 
ccf, upsite technologies, cooling capacity factor, energy costs, cooling, metrics, practical, benchmark
    
Upsite Technologies
Published By: Hewlett Packard Enterprise     Published Date: May 11, 2018
Very little data is available on how effectively enterprises are managing private cloud deployments in the real world. Are they doing so efficiently, or are they facing challenges in areas such as performance, TCO and capacity? Hewlett Packard Enterprise commissioned 451 Research to explore these issues through a survey of IT decision-makers and data from the Cloud Price Index.
    
Hewlett Packard Enterprise
Published By: Hewlett Packard Enterprise     Published Date: May 11, 2018
If your business is like most, you are grappling with data storage. In an annual Frost & Sullivan survey of IT decision-makers, storage growth has been listed among the top data center challenges for the past five years. With businesses collecting, replicating, and storing exponentially more data than ever before, simply acquiring sufficient storage capacity is a problem. Even more challenging is that businesses expect more from their stored data. Data is now recognized as a precious corporate asset and competitive differentiator: spawning new business models, new revenue streams, greater intelligence, streamlined operations, and lower costs. Booming market trends such as the Internet of Things and Big Data analytics are generating new opportunities faster than IT organizations can prepare for them.
    
Hewlett Packard Enterprise
Published By: Dell EMC     Published Date: Nov 08, 2016
Time-to-market, consolidation, and complexity struggles are a thing of the past. Join your peers in database storage nirvana with the Dell EMC All-Flash portfolio, powered by Intel® Xeon® processors.
Tags : 
database, consolidation, capacity, storage, complexity
    
Dell EMC
Published By: Oracle CX     Published Date: Oct 19, 2017
The Software in Silicon design of the SPARC M7 processor, and of the recently announced SPARC S7 processor, implements memory-access validation directly in the processor so that you can protect application data that resides in memory. It also includes on-chip Data Analytics Accelerator (DAX) engines that are specifically designed to accelerate analytic functions. The DAX engines make in-memory databases and applications run much faster, and they significantly increase usable memory capacity by allowing compressed databases to be stored in memory without a performance penalty. Two families of Software in Silicon technologies are implemented in the SPARC S7 and M7 processors: Security in Silicon, which encompasses Silicon Secured Memory and cryptographic instruction acceleration, and SQL in Silicon, which includes In-Memory Query Acceleration and In-Line Decompression. Silicon Secured Memory is the first-ever end-to-end implementation of memory-access validation done in hardware.
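As a back-of-envelope illustration of the capacity claim above: if compressed data can be scanned in place, the logical data a server holds in memory scales with the compression ratio. The sketch below uses hypothetical figures, not Oracle benchmarks.

```python
# Back-of-envelope sketch of the in-memory capacity effect described above:
# if the DAX engines can scan compressed data in place, usable in-memory
# capacity scales with the compression ratio. Numbers are hypothetical,
# not Oracle measurements.

def effective_memory_gb(physical_gb: float, compression_ratio: float) -> float:
    """Logical data volume that fits in memory when stored compressed."""
    return physical_gb * compression_ratio

# 1 TB of DRAM holding data compressed 3:1 behaves like ~3 TB for queries.
print(effective_memory_gb(physical_gb=1024, compression_ratio=3.0))
```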
    
Oracle CX
Published By: Dell and Nutanix     Published Date: Jan 16, 2018
Because many SQL Server implementations are running on virtual machines already, the use of a hyperconverged appliance is a logical choice. The Dell EMC XC Series with Nutanix software delivers high performance and low Opex for both OLTP and analytical database applications. For those moving from SQL Server 2005 to SQL Server 2016, this hyperconverged solution provides particularly significant benefits.
Tags : 
data, security, add capacity, infrastructure, networking, virtualization, dell
    
Dell and Nutanix
Published By: Digital Realty     Published Date: Feb 25, 2015
When measuring competitive differentiation in milliseconds, connectivity is a key component for any financial services company’s data center strategy. In planning the move of its primary data center, a large international futures and commodities trading company needed to find a provider that could deliver the high capacity connectivity it required.
Tags : 
financial services, trade processing, data center, connectivity, it management, data management, business technology
    
Digital Realty
Published By: Dell EMC     Published Date: May 12, 2016
Businesses face greater uncertainty than ever. Market conditions, customer desires, competitive landscapes, and regulatory constraints change by the minute. So business success is increasingly contingent on predictive intelligence and hyperagile responsiveness to relentlessly evolving demands. This uncertainty has significant implications for the data center — especially as business becomes pervasively digital. IT has to support business agility by being more agile itself. It has to be able to add services, scale capacity up and down as needed, and nimbly remap itself to changes in organizational structure.
    
Dell EMC
Published By: Dell Brought to you by Intel     Published Date: Dec 09, 2013
Database performance and memory capacity with the Intel Xeon processor E5-2660 v2-powered Dell PowerEdge M620.
Tags : 
dell, xeon processor e5-2660 v2, database performance, intel xeon processor, poweredge m620, software development, it management
    
Dell Brought to you by Intel
Published By: Dell EMC     Published Date: Aug 17, 2017
For many companies the appeal of the public cloud is very real. For tech startups, the cloud may be their only option, since many don’t have the capital or expertise to build and operate the IT systems their businesses need. Existing companies with established data centers are also looking at public clouds to increase IT agility while limiting risk. The idea of building out their production capacity while possibly reducing the costs attached to that infrastructure can be attractive. For most companies the cloud isn’t an “either-or” decision, but an operating model to be evaluated along with on-site infrastructure. And like most infrastructure decisions, the question of cost is certainly a consideration. In this report we’ll explore that question, comparing the cost of an on-site hyperconverged solution with a comparable setup in the cloud. The on-site infrastructure is a Dell EMC VxRail™ hyperconverged appliance cluster, and the cloud solution is Amazon Web Services (AWS).
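The sketch below shows the general shape of such a comparison, not the report’s findings: straight-line amortization of on-site capex plus opex, against an always-on cloud fleet billed at a flat hourly rate. Every figure is hypothetical.

```python
# Illustrative only: the shape of an on-site vs. cloud cost comparison
# like the one the report performs. Every number here is hypothetical;
# none comes from the Dell EMC / AWS report itself.

def on_site_monthly_cost(capex: float, term_months: int, monthly_opex: float) -> float:
    """Hardware amortized straight-line over the term, plus monthly opex."""
    return capex / term_months + monthly_opex

def cloud_monthly_cost(instances: int, hourly_rate: float,
                       hours_per_month: float = 730.0) -> float:
    """An always-on instance fleet billed at a flat hourly rate."""
    return instances * hourly_rate * hours_per_month

onsite = on_site_monthly_cost(capex=250_000, term_months=36, monthly_opex=2_500)
cloud = cloud_monthly_cost(instances=8, hourly_rate=1.50)
print(f"on-site ~ ${onsite:,.0f}/month vs. cloud ~ ${cloud:,.0f}/month")
```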
Tags : 
public cloud, it systems, data center, it agility, hyperconverged solution, hyperconverged appliance
    
Dell EMC
Published By: Butler Technologies     Published Date: Jul 03, 2018
MPO connectors increase your data capacity through highly efficient use of space, but users have faced challenges such as the extra complexity and time required to test and troubleshoot multi-fiber networks. VIAVI helps overcome these challenges with the industry's most complete portfolio of test solutions for MPO connectivity.
    
Butler Technologies
Published By: Dell Storage     Published Date: Apr 17, 2012
A scale-out storage architecture helps organizations deal with demands for growing data capacity and access. Dell engineers put Dell EqualLogic scale-out storage through its paces to demonstrate its scalability in both file and block I/O scenarios.
Tags : 
storage
    
Dell Storage
Published By: HP     Published Date: Jan 18, 2013
Today enterprises are more dependent on robust, agile IT solutions than ever before. It’s not just about technology—people and processes need to make the cloud journey too, and to realize the benefit of new technology, new support is needed.
Tags : 
data center, hp data center care, flexible capacity service, service, capacity, flexible, it management, data management, business technology
    
HP
Published By: PernixData     Published Date: Jun 01, 2015
Storage arrays are struggling to keep up with virtualized data centers. The traditional solution of buying more capacity to get more performance is an expensive answer – with inconsistent results. A new approach is required to more cost effectively provide the storage performance you need, when and where you need it most.
Tags : 
pernixdata, sql, database, servers, architecture, data management
    
PernixData
Published By: Carbonite     Published Date: Apr 09, 2018
Global data deduplication provides important benefits over traditional deduplication processes because it removes redundant data across entire enterprises, not just single devices. Global deduplication increases the data deduplication ratio: the size of the original data measured against the size of the data store after redundancies are removed. This helps reduce the amount of storage required at a time when businesses face exponential storage growth. Chief benefits of global deduplication include:
• Reductions in storage of up to 60%
• An optimal deduplication ratio
• Enterprise-wide reach
• Massive reductions in backup-related WAN traffic
By shrinking storage capacity needs, data deduplication can cut storage costs quickly. At the same time, businesses today need to access and utilize their data in real time, keeping the most recent and relevant information available. By eliminating redundant data, deduplication technology makes it simpler for data to be managed across the enterprise.
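The ratio described above reduces to a single division; here is a minimal sketch with hypothetical figures.

```python
# Minimal sketch of the deduplication arithmetic described above:
# ratio = original data size / size actually stored, plus the matching
# percentage reduction in storage. Figures are hypothetical.

def dedup_ratio(original_gb: float, stored_gb: float) -> float:
    """How many times the logical data exceeds what is physically stored."""
    return original_gb / stored_gb

def storage_reduction_pct(original_gb: float, stored_gb: float) -> float:
    """Percentage of storage saved by removing redundant data."""
    return (1.0 - stored_gb / original_gb) * 100.0

# 100 TB of logical backup data landing in 40 TB of unique blocks:
print(dedup_ratio(100_000, 40_000))           # 2.5 (a 2.5:1 ratio)
print(storage_reduction_pct(100_000, 40_000)) # 60.0, matching "up to 60%" above
```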
    
Carbonite
Published By: IBM     Published Date: May 02, 2013
The enormous volume, velocity and variety of data flooding the enterprise, along with the push for analytics and business intelligence, is creating a massive challenge that is overwhelming traditional storage approaches. As the demand for capacity continues to escalate, companies must be able to effectively and dynamically manage not only the supply of storage resources but also the demand for them. The key is to optimize the infrastructure through standardization and virtualization, and to replace manual tasks with policy-based automation.
Tags : 
optimize, storage, efficient, data center, analytics, business, virtualization
    
IBM
Published By: IBM     Published Date: May 02, 2013
The explosion in IT demand has intensified pressure on data center resources, making it difficult to respond to business needs, especially while budgets remain flat. As capacity demands become increasingly unpredictable, calculating the future needs of the data center becomes ever more difficult. The challenge is to build a data center that will be functional, highly efficient and cost-effective to operate over its 10-to-20-year lifespan. Facilities that succeed are focusing on optimization, flexibility and planning—infusing agility through a modular data center design.
Tags : 
modular, data center, efficient, optimization, flexibility, cost-effective
    
IBM
Published By: Dell EMC     Published Date: Aug 22, 2017
Dell EMC Isilon® scale-out NAS is the ideal platform to store, manage, protect and analyze your unstructured data. Isilon is the only platform that scales capacity and performance in minutes – allowing you to infinitely consolidate unstructured data, cut costs, and gain new levels of agility and insight to accelerate your business.
    
Dell EMC
Published By: SAP     Published Date: Nov 22, 2017
Many energy and natural resource (ENR) companies still rely on static, error-prone data sources. Basing assumptions about actual production costs, operating capacity, and yields on static data is no longer sufficient to compete successfully in today’s market, a situation made more problematic given that the ENR industry faces reduced market prices for its products, increased operating costs, and an onslaught of new competitors.
Tags : 
companies, operating capacity, sufficient, successfully, market, enr, industry
    
SAP
Published By: PC World     Published Date: Jul 02, 2012
Data storage strategies and insights for growing businesses.
Tags : 
data growth, dell, pc world, capacity expansion
    
PC World
Published By: NetApp     Published Date: Sep 30, 2013
"Today’s data centers are being asked to do more at less expense and with little or no disruption to company operations. They’re also expected to run 24x7, handle numerous new application deployments and manage explosive data growth. Data storage limitations can make it difficult to meet these stringent demands. Faced with these challenges, CIOs are discovering that the “rip and replace” disruptive migration method of improving storage capacity and IO performance no longer works. Access this white paper to discover a new version of NetApps storage operating environment. Find out how this software update eliminates many of the problems associated with typical monolithic or legacy storage systems."
Tags : 
storage infrastructure, clustered storage, technology, scalability, application, storage solution, storage, data center
    
NetApp
Published By: NetApp     Published Date: Sep 21, 2017
This white paper provides an overview of how SolidFire enables efficient data distribution and management to maximize system capacity economically, without degrading performance.
Tags : 
netapp, database performance, flash storage, data management, cost challenges, solidfire, data
    
NetApp
Published By: NetApp     Published Date: Sep 21, 2017
In the current landscape of modern data centers, IT professionals are stretched too thin. Triage situations are the norm and tend to reduce the time spent on strategic business objectives. This paper offers a solution to this IT dilemma, outlining the ways to achieve a storage infrastructure that enables greater performance and capacity.
Tags : 
netapp, database performance, flash storage, data management, cost challenges, all-flash
    
NetApp