A hallucination is incorrect information generated by an algorithm and presented as fact.
AIOps (artificial intelligence for IT operations) is the analysis of alerts and events, with automated IT operational responses, to maintain the health, uptime, and performance of services and solutions.
Anomaly analysis uses AI to identify the normal operating workload range for an application and alerts you when it detects changes. It can also flag potential malicious activity, like ransomware.
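As a rough illustration of the underlying idea, the sketch below flags values that fall outside a baseline learned from recent samples; the z-score test, window size, and latency numbers are hypothetical, not how any particular product works.

```python
from statistics import mean, stdev

def is_anomalous(history, new_value, z_threshold=3.0):
    """Flag new_value if it falls outside z_threshold standard
    deviations of the recent baseline (a simple z-score test)."""
    baseline_mean = mean(history)
    baseline_std = stdev(history)
    if baseline_std == 0:
        return new_value != baseline_mean
    z = abs(new_value - baseline_mean) / baseline_std
    return z > z_threshold

# Hypothetical latency samples (ms) defining the "normal" range.
normal_window = [102, 98, 105, 99, 101, 97, 103, 100]
print(is_anomalous(normal_window, 101))   # False: within normal range
print(is_anomalous(normal_window, 250))   # True: possible incident
```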
Ansible is a configuration management platform that automates storage, servers, and networking.
Artificial intelligence (AI) is the basis for mimicking human intelligence processes through the creation and application of algorithms built into a dynamic computing environment. Stated simply, AI is trying to make computers think and act like humans.
Automation is a process by which routine workflows and processes—scheduling, monitoring, maintenance, application delivery, and so on—are managed and executed without human administration.
Backup as a service (BaaS) is a method of offsite data storage in which files, folders, or the entire contents of a hard drive are regularly backed up by a service vendor to a remote secure cloud-based data repository over a network connection.
Backup and recovery describes the process of creating and storing copies of data that can be used to protect organizations against data loss.
Big data analytics is the process of examining large and varied datasets to uncover hidden patterns, unknown correlations, market trends, customer preferences, and other useful information that can help organizations make better informed business decisions.
Block storage is a storage scheme in which each volume acts as a separate hard drive, configured by the storage administrator.
Continuous integration (CI) and continuous delivery (CD) are two approaches to software development that are designed to improve code quality and enable rapid delivery and deployment of code. They are usually deployed together (CI/CD) to ensure rapid overall delivery of new software features and fixes.
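Purely as an illustration of the CI/CD contract (each stage must succeed before the next one runs), here is a toy pipeline runner in Python; real pipelines are defined in a CI system's own configuration format, and the commands below (pytest, build.py, deploy.py) are assumed placeholders.

```python
import subprocess
import sys

# Hypothetical pipeline stages: each is a command that must
# succeed before the next stage runs.
STAGES = [
    ("test",   ["pytest", "-q"]),          # continuous integration
    ("build",  ["python", "build.py"]),    # package the artifact
    ("deploy", ["python", "deploy.py"]),   # continuous delivery
]

for name, cmd in STAGES:
    print(f"--- stage: {name} ---")
    result = subprocess.run(cmd)
    if result.returncode != 0:
        sys.exit(f"stage '{name}' failed; stopping the pipeline")
print("pipeline finished: new build delivered")
```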
BlueXP's Data Classification is an AI-driven toolkit that automatically scans, analyzes, and categorizes your data for enhanced governance and privacy.
Cloud analytics describes the application of analytic algorithms in the cloud against data in a private or public cloud to then deliver a result of interest. Cloud analytics involves deployment of scalable cloud computing with powerful analytic software to identify patterns in data and to extract new insights.
Cold data refers to infrequently accessed data.
Computer vision is a field of AI that trains computers to capture and interpret information from image and video data.
Configuration and resource management is an automated method for maintaining computer systems and software in a known, consistent state.
Containers are a form of operating system virtualization. A single container might be used to run anything from a small microservice or software process to a larger application.
Converged infrastructure (CI) combines compute, storage, and networking infrastructure into a single, integrated data center solution.
A data fabric is an architecture that seamlessly connects private, public, and hybrid cloud environments.
A data lake can store large amounts of structured and unstructured data in their native formats.
Data obfuscation is an AI-driven solution that helps keep sensitive data private by masking or anonymizing it.
A data pipeline is the collection of data infrastructure (software and supporting hardware) needed to collect, prepare, and manage data for an AI algorithm.
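A minimal sketch of a pipeline's stages in Python; the file name, field names, and stand-in model below are hypothetical.

```python
# A toy data pipeline: each stage hands its output to the next.
import csv

def collect(path):
    """Collect: read raw records from a source system."""
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def prepare(records):
    """Prepare: clean and normalize records for the model."""
    return [
        {"feature": float(r["reading"])}
        for r in records
        if r.get("reading")          # drop incomplete rows
    ]

def feed(examples, model):
    """Manage: hand prepared examples to an AI algorithm."""
    return [model(e["feature"]) for e in examples]

# Usage with a stand-in "model":
# predictions = feed(prepare(collect("sensor_data.csv")), lambda x: x > 0.5)
```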
Data sprawl refers to the exponential growth of data and data sources across an organization's data estate.
A data warehouse is a type of data repository that centralizes many sources of data in an organization.
Deep learning is a branch of machine learning. Unlike traditional machine learning algorithms, many of which have a finite capacity to learn no matter how much data they acquire, deep learning systems can improve their performance with access to more data: the machine version of more experience.
DevOps is a philosophy and framework that encourages efficient development and faster release of new or revised software features or products to customers.
A digital twin is a virtual model of a real-world object or process that can be used to simulate real-world conditions in real time.
Edge computing refers to the ability of devices to compute, process, and analyze data close to where it is generated, with minimal latency.
Flash storage is a data storage technology based on high-speed, electrically programmable memory.
Graphics processing units (GPUs) are typically used for graphics visualization (rendering) by performing repetitive arithmetic calculations. This repetitive compute capability is often used for AI and deep learning use cases.
High performance computing (HPC) is the ability to process data and perform complex calculations at high speeds.
Hybrid cloud refers to a mixed computing, storage, and services environment made up of on-premises infrastructure, private cloud services, and a public cloud—such as Amazon Web Services (AWS) or Microsoft Azure—with orchestration among the various platforms.
Infrastructure as Code (IaC) is an approach to managing data center server, storage, and networking infrastructure. With IaC, infrastructure configuration information is housed in standardized files, which can be read by software that maintains the state of the infrastructure. IaC can improve productivity and reliability because it eliminates manual configuration steps.
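The toy sketch below illustrates the core IaC idea, declared desired state reconciled against live state; the resource names and the dict standing in for a configuration file are hypothetical, not any real tool's format.

```python
# Desired state lives in a standardized "file" (a dict stands in for
# YAML/JSON here); software reconciles the live environment toward it.
desired_state = {
    "web-01": {"cpu": 4, "memory_gb": 16},
    "web-02": {"cpu": 4, "memory_gb": 16},
}

live_state = {
    "web-01": {"cpu": 2, "memory_gb": 16},   # drifted from the spec
}

def reconcile(desired, live):
    for name, spec in desired.items():
        if name not in live:
            print(f"create {name} with {spec}")
        elif live[name] != spec:
            print(f"update {name}: {live[name]} -> {spec}")
    for name in live:
        if name not in desired:
            print(f"delete {name} (not in the declared state)")

reconcile(desired_state, live_state)
```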
AI inference is the process during which new data is run through a trained AI model in order to make new predictions or complete a new task.
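A minimal inference sketch, assuming a scikit-learn model was trained earlier and saved with joblib; the file name and feature rows are hypothetical.

```python
import joblib

model = joblib.load("churn_model.joblib")   # previously trained model
new_data = [[42.0, 3, 1], [18.5, 1, 0]]     # unseen feature rows
predictions = model.predict(new_data)        # the inference step
print(predictions)
```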
The Internet of Things (IoT) is a network of connected devices and sensors that can monitor, collect, and exchange data about the behavior of “things” in the real world, or at the “edge,” like smartwatches and industrial equipment.
Kubernetes is a container orchestration platform that makes it easier to deploy and operate containers at scale.
A large language model (LLM) is a type of machine learning model that uses deep learning to train on enormous amounts of text data.
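As a small, hedged example, the snippet below uses the Hugging Face transformers library (assumed installed) to load a pretrained language model and generate text; gpt2 is only a convenient demo model, far smaller than modern LLMs.

```python
from transformers import pipeline

# Downloads a small pretrained model on first run.
generator = pipeline("text-generation", model="gpt2")
result = generator("Machine learning is", max_new_tokens=20)
print(result[0]["generated_text"])
```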
A subset of artificial intelligence (AI), machine learning (ML) is the area of computational science that focuses on analyzing and interpreting patterns and structures in data to enable learning, reasoning, and decision making outside of human interaction.
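A minimal sketch of the idea with scikit-learn (assumed installed): the toy model below learns the boundary between two groups from labeled examples rather than from hand-written rules.

```python
from sklearn.linear_model import LogisticRegression

X = [[1], [2], [3], [10], [11], [12]]   # toy feature values
y = [0, 0, 0, 1, 1, 1]                  # toy labels

clf = LogisticRegression()
clf.fit(X, y)                            # learn a pattern from the data
print(clf.predict([[2.5], [10.5]]))      # -> [0 1]
```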
Microservices are an architectural approach to creating cloud applications. Each application is built as a set of services, and each service runs in its own process and communicates through APIs.
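To illustrate, here is a toy single-endpoint service built only with the Python standard library; a real application would be composed of many such independently deployed services, and the route and payload here are made up.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class PriceService(BaseHTTPRequestHandler):
    """One small service exposing one HTTP API endpoint."""
    def do_GET(self):
        if self.path == "/price":
            body = json.dumps({"sku": "demo", "price": 9.99}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

if __name__ == "__main__":
    # Runs in its own process; other services would call it over HTTP.
    HTTPServer(("localhost", 8080), PriceService).serve_forever()
```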
Multicloud refers to the distribution of cloud assets, software, and applications across several cloud environments, using multiple cloud computing platforms to support a single application or ecosystem of applications that work together in a common architecture.
A NAS (network-attached storage) system is a file-dedicated storage device attached to a network that allows data to be accessed by multiple authorized users and devices.
Natural language processing (NLP) is an application of machine learning that trains computer programs to understand human language, both spoken and written.
A neural network is a machine learning technique that uses interconnected nodes, a structure inspired by the human brain.
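A minimal sketch of the mechanics in NumPy: each layer of interconnected nodes is a matrix multiplication followed by a nonlinear activation. The weights here are random and illustrative, not trained.

```python
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.normal(size=(3, 4))   # input layer (3 nodes) -> hidden layer (4)
W2 = rng.normal(size=(4, 1))   # hidden layer (4 nodes) -> output (1)

def relu(x):
    """Nonlinear activation applied at each hidden node."""
    return np.maximum(0, x)

x = np.array([0.5, -1.2, 3.0])   # one input example
hidden = relu(x @ W1)            # hidden-node activations
output = hidden @ W2             # network output
print(output)
```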
You can use the non-volatile memory express (NVMe) protocol to provide storage in a SAN environment. The NVMe protocol is optimized for performance with solid state storage.
Object storage, also known as object-based storage, is a strategy that manages and manipulates data storage as distinct units, called objects. These objects are kept in a single repository and are not nested as files inside folders.
Public cloud refers to computing services offered by third-party providers like Microsoft Azure, Amazon Web Services (AWS), and Google Cloud.
Ransomware is malicious software that allows an attacker to access and encrypt a victim’s files, delete the originals, and then demand payment in exchange for restoring access to the only remaining (encrypted) copies.
A storage area network (SAN) consists of a storage solution connected to hosts over a SAN transport protocol such as iSCSI or FC. You can configure your SAN so that your storage solution attaches to your hosts through one or more switches.
Site reliability engineering (SRE) is a discipline to create ultra-scalable and reliable software systems by applying software engineering practices to infrastructure and operations problems.
Software-defined storage (SDS) enables users and organizations to uncouple or abstract storage resources from the underlying hardware platform for greater flexibility, efficiency, and scalability by making storage resources programmable.
Storage as a service (STaaS) is a consumption model in which a provider supplies storage capacity on demand, typically by subscription, so organizations pay for storage without buying and managing the underlying hardware.
Structured data can be thought of as records (or transactions) in a database environment; for example, rows in a table of a SQL database.
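For example, using Python's built-in sqlite3 module, structured data looks like fixed-schema rows in a table (the table and values below are made up):

```python
import sqlite3

# An in-memory database, so nothing is written to disk.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (id INTEGER, customer TEXT, total REAL)")
con.execute("INSERT INTO orders VALUES (1, 'Acme', 99.50)")
con.execute("INSERT INTO orders VALUES (2, 'Globex', 12.00)")

# Each row is one record/transaction with a fixed schema.
for row in con.execute("SELECT * FROM orders"):
    print(row)
```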
Unstructured data refers to datasets (typically large collections of files) that aren't stored in a structured database format.
Virtual desktop infrastructure (VDI) is a virtualization solution that uses virtual machines to provide and manage virtual desktops.
VMware Cloud Foundation (VCF) is a hybrid cloud platform for both traditional enterprise and modern applications.
The concept of Zero Trust envisions network security from the inside-out rather than from the outside-in.