Database management

Terms related to databases, including definitions about relational databases and words and phrases about database management.

3-T - MUL

  • 3-tier application architecture - A 3-tier application architecture is a modular client-server architecture that consists of a presentation tier, an application tier and a data tier.
  • 99.999 (Five nines or Five 9s) - In computing, 99.999 -- often called five nines -- refers to a desired availability level in which a system or service is operational 99.999% of the time, equivalent to roughly five minutes of downtime per year.
  • Active Directory - Active Directory (AD) is Microsoft's proprietary directory service.
  • Amazon DynamoDB - Amazon DynamoDB is a fully managed NoSQL database service offered by AWS, designed to provide low latency and high performance for applications.
  • Amazon RDS (Relational Database Service) - Amazon Relational Database Service (RDS) is a managed SQL database service provided by Amazon Web Services (AWS).
  • artifact (software development) - An artifact is a byproduct of software development that helps describe the architecture, design and function of software.
  • Azure Data Studio (formerly SQL Operations Studio) - Azure Data Studio is a Microsoft tool, originally named SQL Operations Studio, for managing SQL Server databases and cloud-based Azure SQL Database and Azure SQL Data Warehouse systems.
  • Basic Assembler Language (BAL) - BAL (Basic Assembler Language) is a version of IBM's assembler language (sometimes called assembly language) for its System/360 and System/370 mainframe computers.
  • C++ - C++ is an object-oriented programming (OOP) language that is viewed by many as the best language for creating large-scale applications.
  • CICS (Customer Information Control System) - CICS (Customer Information Control System) is middleware that sits between the z/OS IBM mainframe operating system and business applications.
  • cold backup (offline backup) - A cold backup is a backup of an offline database.
  • columnar database - A columnar database is a database management system (DBMS) that stores data in columns instead of rows.
  • conformed dimension - In data warehousing, a conformed dimension is a dimension that has the same meaning to every fact with which it relates.
  • CRM (customer relationship management) analytics - CRM (customer relationship management) analytics comprises all of the programming that analyzes data about customers and presents it to an organization to help facilitate and streamline better business decisions.
  • cryptographic nonce - A nonce is a random or semi-random number that is generated for a specific use.
  • customer data integration (CDI) - Customer data integration (CDI) is the process of defining, consolidating and managing customer information across an organization's business units and systems to achieve a "single version of the truth" for customer data.
  • customer segmentation - Customer segmentation is the practice of dividing a customer base into groups of individuals that have similar characteristics relevant to marketing, such as age, gender, interests and spending habits.
  • data - In computing, data is information that has been translated into a form that is efficient for movement or processing.
  • data abstraction - Data abstraction is the reduction of a particular body of data to a simplified representation of the whole.
  • data aggregation - Data aggregation is any process whereby data is gathered and expressed in a summary form.
  • data analytics (DA) - Data analytics (DA) is the process of examining data sets to find trends and draw conclusions about the information they contain.
  • data availability - Data availability is a term used by computer storage manufacturers and storage service providers to describe how data should be available at a required level of performance in situations ranging from normal through disastrous.
  • data center infrastructure management (DCIM) - Data center infrastructure management (DCIM) is the convergence of IT and building facilities functions within an organization.
  • data cleansing (data cleaning, data scrubbing) - Data cleansing, also referred to as data cleaning or data scrubbing, is the process of fixing incorrect, incomplete, duplicate or otherwise erroneous data in a data set.
  • Data Definition Language (DDL) - Data Definition Language (DDL) is used to create and modify the structure of objects in a database using predefined commands and a specific syntax (a brief example appears after this list).
  • data fabric - A data fabric is an architecture and software offering a unified collection of data assets, databases and database architectures within an enterprise.
  • data integrity - Data integrity is the assurance that digital information is uncorrupted and can only be accessed or modified by those authorized to do so.
  • data management as a service (DMaaS) - Data management as a service (DMaaS) is a type of cloud service that provides enterprises with centralized storage for disparate data sources.
  • data mart (datamart) - A data mart is a repository of data that is designed to serve a particular community of knowledge workers.
  • data mining - Data mining is the process of sorting through large data sets to identify patterns and relationships that can help solve business problems through data analysis.
  • data modeling - Data modeling is the process of creating a simplified diagram of a software system and the data elements it contains, using text and symbols to represent the data and how it flows.
  • data preprocessing - Data preprocessing, a component of data preparation, describes any type of processing performed on raw data to prepare it for another data processing procedure.
  • data profiling - Data profiling refers to the process of examining, analyzing, reviewing and summarizing data sets to gain insight into the quality of data.
  • data quality - Data quality is a measure of the condition of data based on factors such as accuracy, completeness, consistency, reliability and whether it's up to date.
  • data set - A data set is a collection of data that contains individual data units organized (formatted) in a specific way and accessed by one or more specific access methods based on the data set organization and data structure.
  • data source name (DSN) - A data source name (DSN) is a data structure containing information about a specific database to which an Open Database Connectivity (ODBC) driver needs to connect.
  • data splitting - Data splitting is the practice of dividing a data set into two or more subsets, most commonly to separate the data used to train a model from the data used to test it (a short sketch appears after this list).
  • data structures - A data structure is a specialized format for organizing, processing, retrieving and storing data.
  • data warehouse - A data warehouse is a repository of data from an organization's operational systems and other sources that supports analytics applications to help drive business decision-making.
  • database (DB) - A database is a collection of information that is organized so that it can be easily accessed, managed and updated.
  • database as a service (DBaaS) - Database as a service (DBaaS) is a cloud computing managed service offering that provides access to a database without requiring the setup of physical hardware, the installation of software or the need to configure the database.
  • database automation - Database automation is the use of unattended processes and self-updating procedures for administrative tasks in a database.
  • database management system (DBMS) - A database management system (DBMS) is system software for creating and managing databases, allowing end users to create, protect, read, update and delete data in a database.
  • database marketing - Database marketing is a systematic approach to the gathering, consolidation and processing of consumer data.
  • database normalization - Database normalization is the process of organizing data in a relational database into tables and columns so that redundancy is reduced and data dependencies are kept consistent; it is intrinsic to most relational database schemas (a sketch covering normalization and denormalization appears after this list).
  • database replication - Database replication is the frequent electronic copying of data from a database in one computer or server to a database in another -- so that all users share the same level of information.
  • Db2 - Db2 is a family of database management system (DBMS) products from IBM that serve a number of different operating system (OS) platforms.
  • deep analytics - Deep analytics is the application of sophisticated data processing techniques to yield information from large and typically multi-source data sets made up of both unstructured and semi-structured data.
  • denormalization - Denormalization is the process of adding precomputed redundant data to an otherwise normalized relational database to improve read performance of the database.
  • dimension - In data warehousing, a dimension is a collection of reference information that supports a measurable event, such as a customer transaction.
  • distributed database - A distributed database is a database that consists of two or more files located in different sites either on the same network or on entirely different networks.
  • distributed ledger technology (DLT) - Distributed ledger technology (DLT) is a digital system for recording the transaction of assets in which the transactions and their details are recorded in multiple places at the same time.
  • document-oriented database - A document-oriented database is a type of NoSQL database in which data is stored as documents -- typically in formats such as JSON, BSON or XML -- rather than in the rows and columns of a relational database.
  • Dublin Core - Dublin Core is an international metadata standard formally known as the Dublin Core Metadata Element Set and includes 15 metadata (data that describes data) terms.
  • ebXML (Electronic Business XML) - EbXML (Electronic Business XML or e-business XML) is a project to use the Extensible Markup Language (XML) to standardize the secure exchange of business data.
  • Eclipse (Eclipse Foundation) - Eclipse is a free, Java-based development platform known for its plugins that allow developers to develop and test code written in other programming languages.
  • employee self-service (ESS) - Employee self-service (ESS) is a widely used human resources technology that enables employees to perform many job-related functions, such as applying for reimbursement, updating personal information and accessing company benefits information -- which was once largely paper-based, or otherwise would have been maintained by management or administrative staff.
  • encoding and decoding - Encoding and decoding are used in many forms of communications, including computing, data communications, programming, digital electronics and human communications.
  • encryption key management - Encryption key management is the administration of tasks involved with protecting, storing, backing up and organizing encryption keys.
  • Entity Relationship Diagram (ERD) - An entity relationship diagram (ERD), also known as an entity relationship model, is a graphical representation that depicts relationships among people, objects, places, concepts or events within an information technology (IT) system.
  • Excel - Excel is a spreadsheet program from Microsoft and a component of its Office product group for business applications.
  • extension - An extension typically refers to a file name extension, the suffix that indicates a file's format, although the term is also used for add-on modules that expand a program's capabilities.
  • failover - Failover is a backup operational mode in which the functions of a system component are assumed by a secondary component when the primary becomes unavailable.
  • field - A field is an area in a fixed or known location in a unit of data such as a record, message header, or computer instruction that has a purpose and usually a fixed size.
  • file extension (file format) - In computing, a file extension is a suffix added to the name of a file to indicate the file's layout, in terms of how the data within the file is organized.
  • flat file - A flat file is a collection of data stored in a single two-dimensional table in which similar yet discrete strings of information are stored as records.
  • foreign key - A foreign key is a column or columns of data in one table that refers to the unique data values -- often the primary key data -- in another table (see the sketch after this list).
  • framework - In general, a framework is a real or conceptual structure intended to serve as a support or guide for the building of something that expands the structure into something useful.
  • Google BigQuery - Google BigQuery is a cloud-based big data analytics web service for processing very large read-only data sets.
  • Google Bigtable - Google Bigtable is a distributed, column-oriented data store created by Google Inc.
  • graph analytics - Graph analytics is a category of software tools and data mining techniques that help an analyst understand the relationship between entries in a graph database.
  • graph database - A graph database is a type of NoSQL database that uses graph theory to store, map and query relationships.
  • Hadoop Distributed File System (HDFS) - The Hadoop Distributed File System (HDFS) is the primary data storage system used by Hadoop applications.
  • hashing - Hashing is the process of transforming any given key or a string of characters into another value, usually of fixed length (a short example appears after this list).
  • IBM IMS (Information Management System) - IBM IMS (Information Management System) is a database and transaction management system that was first introduced by IBM in 1968.
  • in-database analytics - In-database analytics is a scheme for processing data within the database, avoiding the data movement that slows response time.
  • in-memory database - An in-memory database is a database that keeps its data in a computer's main memory (RAM) rather than on disk, a design intended to streamline the work involved in processing queries.
  • information - Information is stimuli that have meaning in some context for the receiver.
  • ISAM (Indexed Sequential Access Method) - ISAM (Indexed Sequential Access Method) is a file management system that allows records to be accessed either sequentially or randomly.
  • Java Database Connectivity (JDBC) - Java Database Connectivity (JDBC) is an API packaged with the Java SE edition that makes it possible to connect from a Java Runtime Environment (JRE) to external, relational database systems.
  • JDBC driver - A JDBC driver (Java Database Connectivity driver) is a small piece of software that allows JDBC to connect to different databases.
  • job scheduler - A job scheduler is a computer program that enables an enterprise to schedule and, in some cases, monitor computer 'batch' jobs (units of work).
  • job step - In certain computer operating systems, a job step is part of a job, a unit of work that a computer operator (or a program called a job scheduler) gives to the operating system.
  • JOLAP (Java Online Analytical Processing) - JOLAP (Java Online Analytical Processing) is a Java application programming interface (API) for the Java 2 Platform, Enterprise Edition (J2EE) environment that supports the creation, storage, access, and management of data in an online analytical processing (OLAP) application.
  • key-value pair (KVP) - A key-value pair (KVP) is a set of two linked data items: a key, which is a unique identifier for some item of data, and the value, which is either the data that is identified or a pointer to the location of that data (a small example appears after this list).
  • knowledge base - In general, a knowledge base is a centralized repository of information.
  • knowledge management (KM) - Knowledge management is the process by which an enterprise gathers, organizes, shares and analyzes its knowledge in a way that is easily accessible to employees.
  • Lisp (programming language) - Lisp, an acronym for list processing, is a functional programming language that was designed for easy manipulation of lists and symbolic data.
  • managed file transfer (MFT) - Managed file transfer (MFT) is a type of software used to provide secure internal, external and ad-hoc data transfers through a network.
  • MariaDB - MariaDB is an open source relational database management system (DBMS) that is a compatible drop-in replacement for the widely used MySQL database technology.
  • Microsoft Office SharePoint Server (MOSS) - Microsoft Office SharePoint Server (MOSS) is the full version of a portal-based platform for collaboratively creating, managing and sharing documents and Web services.
  • Microsoft SQL Server - Microsoft SQL Server is a relational database management system, or RDBMS, that supports a wide variety of transaction processing, business intelligence and analytics applications in corporate IT environments.
  • Microsoft SQL Server Management Studio (SSMS) - Microsoft SQL Server Management Studio (SSMS) is an integrated environment to manage a SQL Server infrastructure.
  • Microsoft SSIS (SQL Server Integration Services) - Microsoft SSIS (SQL Server Integration Services) is an enterprise data integration, data transformation and data migration tool built into Microsoft's SQL Server database.
  • Microsoft Visual FoxPro (Microsoft VFP) - Microsoft Visual FoxPro (VFP) is an object-oriented programming environment with a built-in relational database engine.
  • middleware - Middleware is software that is used to bridge the gap between applications and operating systems.
  • MongoDB - MongoDB is an open source NoSQL database management program.
  • MPP database (massively parallel processing database) - An MPP database is a database optimized for massively parallel processing, in which many operations are carried out simultaneously by many separate processing units.
  • multidimensional database (MDB) - A multidimensional database (MDB) is a type of database that is optimized for data warehouse and online analytical processing (OLAP) applications.
  • multimodel database - A multimodel database is a data processing platform that supports multiple data models, which define the parameters for how the information in a database is organized and arranged.
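
To illustrate the Data Definition Language (DDL) entry above, here is a minimal sketch using Python's built-in sqlite3 module; the table and column names (customer, email and so on) are hypothetical and chosen only for illustration.

```python
import sqlite3

# In-memory SQLite database; no server or data files are required.
conn = sqlite3.connect(":memory:")

# CREATE TABLE is DDL: it defines the structure of a new database object.
conn.execute("""
    CREATE TABLE customer (
        customer_id INTEGER PRIMARY KEY,
        name        TEXT NOT NULL,
        email       TEXT UNIQUE
    )
""")

# ALTER TABLE is also DDL: it changes the structure of an existing object.
conn.execute("ALTER TABLE customer ADD COLUMN signup_date TEXT")

# DROP TABLE removes the object definition entirely.
conn.execute("DROP TABLE customer")

conn.close()
```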
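
The data splitting entry is most often applied to machine learning work. The sketch below uses only the Python standard library to divide a hypothetical set of 100 records into training and test subsets; the 80/20 ratio and the fixed random seed are assumptions made for the example.

```python
import random

# Hypothetical data set: 100 numbered records standing in for real rows.
records = list(range(100))

# Shuffle with a fixed seed so the split is reproducible.
random.seed(42)
random.shuffle(records)

# Assumed 80/20 split between training and test subsets.
split_point = int(len(records) * 0.8)
train_set = records[:split_point]
test_set = records[split_point:]

print(f"{len(train_set)} training records, {len(test_set)} test records")
```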
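
As a rough sketch of the database normalization and denormalization entries, the example below first stores purchases in a normalized form (customer details held once in their own table) and then builds a denormalized copy that repeats the customer name on every purchase row to make reads simpler and faster; all table and column names are made up for the example.

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Normalized design: customer details are stored once and referenced by id,
# so a name change only has to be made in one place.
conn.executescript("""
    CREATE TABLE customer (
        customer_id INTEGER PRIMARY KEY,
        name        TEXT NOT NULL
    );
    CREATE TABLE purchase (
        purchase_id INTEGER PRIMARY KEY,
        customer_id INTEGER NOT NULL REFERENCES customer(customer_id),
        amount      REAL NOT NULL
    );
    INSERT INTO customer VALUES (1, 'Ada');
    INSERT INTO purchase VALUES (10, 1, 25.00), (11, 1, 40.00);
""")

# Denormalized copy: the customer name is repeated on every purchase row,
# trading extra storage and update work for a simpler, faster read path.
conn.execute("""
    CREATE TABLE purchase_report AS
    SELECT p.purchase_id, c.name AS customer_name, p.amount
    FROM purchase AS p JOIN customer AS c ON c.customer_id = p.customer_id
""")

for row in conn.execute("SELECT * FROM purchase_report"):
    print(row)

conn.close()
```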
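
For the foreign key entry, this sketch (again using sqlite3 with made-up table names) shows a foreign key constraint rejecting a row that references a parent record that does not exist; note that SQLite only enforces the constraint after PRAGMA foreign_keys is switched on.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite enforces foreign keys only when enabled.

conn.executescript("""
    CREATE TABLE department (
        dept_id INTEGER PRIMARY KEY,
        name    TEXT NOT NULL
    );
    CREATE TABLE employee (
        emp_id  INTEGER PRIMARY KEY,
        name    TEXT NOT NULL,
        dept_id INTEGER NOT NULL,
        FOREIGN KEY (dept_id) REFERENCES department (dept_id)
    );
    INSERT INTO department VALUES (1, 'Engineering');
""")

# Accepted: department 1 exists in the referenced table.
conn.execute("INSERT INTO employee VALUES (100, 'Grace', 1)")

# Rejected: department 99 has no matching primary key value.
try:
    conn.execute("INSERT INTO employee VALUES (101, 'Edsger', 99)")
except sqlite3.IntegrityError as err:
    print("Rejected by foreign key constraint:", err)

conn.close()
```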
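
The hashing entry can be illustrated with Python's standard hashlib module; the SHA-256 digest below is deterministic, so the same input always maps to the same fixed-length value (the input string here is arbitrary).

```python
import hashlib

# Hashing maps an input of any length to a value of fixed length.
key = "customer-42"
digest = hashlib.sha256(key.encode("utf-8")).hexdigest()

print(digest)       # 64 hexadecimal characters, regardless of input length.
print(len(digest))  # Always 64, because SHA-256 produces a 256-bit output.

# The same input always yields the same digest, which is what makes hashes
# useful for indexing, deduplication and integrity checks.
assert digest == hashlib.sha256("customer-42".encode("utf-8")).hexdigest()
```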
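
Finally, the key-value pair entry maps naturally onto a Python dictionary; the keys and values below are invented purely to show one unique key identifying one value.

```python
# Each entry pairs a unique key with the value it identifies.
session_store = {
    "user:1001": {"name": "Ada", "plan": "pro"},
    "user:1002": {"name": "Linus", "plan": "free"},
}

# Lookups go through the key rather than by scanning the values.
print(session_store["user:1001"]["plan"])  # -> pro

# Reassigning a key replaces its value; keys remain unique.
session_store["user:1002"] = {"name": "Linus", "plan": "pro"}
```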