Cybersecurity & Data Education Hub

Cybersecurity, compliance, and data management terms can become an overwhelming “alphabet soup.” The DataBee® Cybersecurity & Data Education Hub is regularly updated with new resources to help you make sense of them all.

Security Data Fabric

What is a Security Data Fabric?

A security data fabric is a data fabric architecture that integrates and manages security data from various sources through a unified, secure, and governed approach. It is designed to navigate complex security semantics, streamlining workflows for security, risk, and compliance teams.

What is a Security Data Mesh?

A security data mesh, also referred to as a cybersecurity data mesh, is a decentralized data architecture focused on security data from various sources, in which the business domains that produce the data also own and operate it. This data management design enables organizations to scale security analytics while keeping ownership and accountability with the teams closest to the data.

What is a Security Data Lake?

A security data lake (SDL) is a centralized repository for storing all types of security data in raw, unstructured, or structured formats. Security data lakes give organizations a scalable, cost-effective way to store and analyze the large volumes of security telemetry their environments generate every day.

What is a Security Data Maturity Model?

The Security Data Maturity Model (SDMM), developed by the cybersecurity team at Comcast, is a prescriptive, five-stage approach to how organizations can use security data to make data-driven security decisions. This conceptual framework provides strategic value by helping organizations better assess their security status and risk profile.

Cybersecurity

What is the Open Cybersecurity Schema Framework?

The Open Cybersecurity Schema Framework (OCSF) is a collaborative, open-source project delivering a vendor-agnostic, standardized schema for security data that streamlines security operations and makes it easier for security teams to do their jobs.
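
To make the idea of a shared schema concrete, here is a minimal Python sketch that maps an invented, vendor-specific login record onto OCSF-style attributes. The field names and values are illustrative assumptions, not the authoritative OCSF definitions, which live in the published schema.

```python
# Illustrative sketch of a security event normalized to an OCSF-style shape.
# Field names and values are examples only; the authoritative attribute
# definitions come from the OCSF specification itself.
from datetime import datetime, timezone

raw_event = {
    "src": "10.0.4.17",
    "user": "jdoe",
    "result": "FAILED_LOGIN",
    "ts": "2024-05-01T12:34:56Z",
}

def to_ocsf_like(event: dict) -> dict:
    """Map a vendor-specific log record onto a common, OCSF-style shape."""
    return {
        "class_name": "Authentication",   # event class name (illustrative)
        "activity_name": "Logon",
        "status": "Failure" if "FAILED" in event["result"] else "Success",
        "time": datetime.fromisoformat(event["ts"].replace("Z", "+00:00"))
                .astimezone(timezone.utc).isoformat(),
        "actor": {"user": {"name": event["user"]}},
        "src_endpoint": {"ip": event["src"]},
    }

print(to_ocsf_like(raw_event))
```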

What are Sigma Rules?

Sigma rules are written in the YAML file format (YAML Ain't Markup Language) so that detections can be treated as code. The human-readable YAML files provide a shared language that defenders can use to describe and detect threats in any environment.
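
As a rough illustration of detections as code, the sketch below embeds a simplified, Sigma-style rule as YAML and evaluates it against a sample event in Python. The rule content is invented and much simpler than a real Sigma rule, and PyYAML is an assumed third-party dependency; in practice, tooling such as pySigma converts rules into backend-specific queries.

```python
# A simplified, Sigma-style detection rule expressed as YAML and loaded as data.
import yaml  # assumed dependency: pip install pyyaml

RULE = """
title: Suspicious Failed Logons (illustrative)
status: experimental
logsource:
  product: windows
  service: security
detection:
  selection:
    EventID: 4625
  condition: selection
level: medium
"""

rule = yaml.safe_load(RULE)

def matches(rule: dict, event: dict) -> bool:
    """Naively check an event against the rule's single 'selection' block."""
    selection = rule["detection"]["selection"]
    return all(event.get(field) == value for field, value in selection.items())

print(matches(rule, {"EventID": 4625, "TargetUserName": "jdoe"}))  # True
```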

What is Network Detection and Response (NDR)?

Network Detection and Response (NDR) is a solution focused on detecting and responding to anomalous system activities and behaviors in network traffic data. NDR tools can analyze raw network packets or traffic metadata from traffic inside the network (east-west) as well as traffic crossing the network boundary (north-south).
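
A toy example of the east-west versus north-south distinction, assuming private IP ranges mark internal endpoints; this is a simplification, since real NDR solutions apply much richer behavioral analysis to packets and metadata.

```python
# Minimal sketch: classify network flows as east-west (internal-to-internal)
# or north-south (crossing the network boundary) using private-address checks.
import ipaddress

def flow_direction(src_ip: str, dst_ip: str) -> str:
    src_private = ipaddress.ip_address(src_ip).is_private
    dst_private = ipaddress.ip_address(dst_ip).is_private
    return "east-west" if src_private and dst_private else "north-south"

print(flow_direction("10.1.2.3", "10.4.5.6"))      # east-west
print(flow_direction("10.1.2.3", "203.0.113.10"))  # north-south
```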

Data Management

What is Data Federation?

Data federation is an architectural implementation in which the storage functionality for an application (e.g., create, read, update, and delete, also known as CRUD) is delegated to a remote data repository that manages the storage and responds to data usage instructions from the application.
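
A hypothetical Python sketch of the pattern: the application issues CRUD instructions, while a remote repository (stubbed here with an in-memory store) owns the storage. All class and method names are invented for illustration.

```python
# Sketch of data federation: the application delegates storage to a repository
# it does not manage itself.
class RemoteRepository:
    """Stand-in for a remote data store that handles CRUD on the app's behalf."""
    def __init__(self):
        self._rows = {}

    def create(self, key, value):
        self._rows[key] = value

    def read(self, key):
        return self._rows.get(key)

    def update(self, key, value):
        if key in self._rows:
            self._rows[key] = value

    def delete(self, key):
        self._rows.pop(key, None)

class Application:
    """The application sends data usage instructions; storage lives remotely."""
    def __init__(self, repo: RemoteRepository):
        self.repo = repo

    def register_user(self, user_id, profile):
        self.repo.create(user_id, profile)   # storage handled by the repository

app = Application(RemoteRepository())
app.register_user("u42", {"name": "J. Doe"})
print(app.repo.read("u42"))
```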

What is Data Fabric Federation?

Data fabric federation is an architectural approach that centralizes data from various sources, often using an extract, transform, and load (ETL) process, and normalizes the data across many domains for correlated insights and queries.
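
For illustration, the sketch below centralizes records from two invented domains and normalizes their field names to a shared schema so they can be queried together; the field mappings are assumptions, not a prescribed model.

```python
# Sketch: centralize records from two domains and normalize them to one schema.
hr_records = [{"employee_id": "E1", "mail": "jdoe@example.com"}]
it_records = [{"user": "jdoe", "email_addr": "jdoe@example.com", "host": "lt-77"}]

def normalize(record: dict, mapping: dict) -> dict:
    """Rename source-specific fields to the shared (normalized) field names."""
    return {mapping[k]: v for k, v in record.items() if k in mapping}

normalized = (
    [normalize(r, {"employee_id": "person_id", "mail": "email"}) for r in hr_records]
    + [normalize(r, {"user": "username", "email_addr": "email", "host": "device"}) for r in it_records]
)
print(normalized)
```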

What is a Data Catalog?

A data catalog is an inventory or directory of an organization’s data assets, often used to locate data, to identify and understand data attributes, and to give users streamlined, on-demand access to data for analytics. Metadata, the structured, descriptive information that identifies data’s attributes, is used to power the data catalog.
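
A minimal sketch of the idea, with invented datasets and metadata fields, showing how catalog entries let users find data assets by their attributes:

```python
# Sketch of a data catalog: an inventory of datasets described by metadata
# (owner, schema, tags) that helps users locate and understand data assets.
catalog = {
    "auth_logs": {
        "owner": "security-engineering",
        "description": "Normalized authentication events",
        "schema": ["time", "user", "src_ip", "status"],
        "tags": ["security", "pii"],
    },
    "asset_inventory": {
        "owner": "it-ops",
        "description": "Known hosts and their owners",
        "schema": ["hostname", "owner", "os"],
        "tags": ["assets"],
    },
}

def find_datasets(catalog: dict, tag: str) -> list:
    """Return the names of datasets whose metadata carries the given tag."""
    return [name for name, meta in catalog.items() if tag in meta["tags"]]

print(find_datasets(catalog, "security"))  # ['auth_logs']
```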

What is Data Lineage?

Data lineage is the process of tracking – and ideally documenting – the journey of data over time. This begins from its creation at the source, includes various transformations as it moves through data pipelines, workflow engines, and ETL/ELT processes, and ends at the final application.
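
A small Python sketch of the idea, recording one lineage entry per hop as an invented dataset moves from its source through an ETL step to a reporting application:

```python
# Sketch: record a lineage entry for each step a dataset passes through,
# from creation at the source to the final application.
from datetime import datetime, timezone

lineage = []

def record_step(dataset: str, step: str, produced: str) -> None:
    """Append one hop of the dataset's journey to the lineage log."""
    lineage.append({
        "dataset": dataset,
        "step": step,
        "produced": produced,
        "at": datetime.now(timezone.utc).isoformat(),
    })

record_step("firewall_logs_raw", "ingest from source", "firewall_logs_staged")
record_step("firewall_logs_staged", "ETL: normalize fields", "firewall_logs_normalized")
record_step("firewall_logs_normalized", "load into reporting app", "compliance_dashboard")

for hop in lineage:
    print(f'{hop["dataset"]} -> {hop["produced"]} ({hop["step"]})')
```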

What is Auto-Parsing?

Parsing, also called syntax analysis or syntactic analysis, is the process of using a parser to extract data elements from structured, semi-structured, and unstructured data in divergent formats and convert them into a consistent format. Auto-parsing uses purpose-built parsers to automatically break data down into a format that is easier to analyze.
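
As a simple illustration, the sketch below uses a regular expression to parse an invented log line into a consistent key/value format; auto-parsing tooling selects or generates such parsers automatically across many formats.

```python
# Sketch of parsing: extract data elements from a raw (semi-structured) log
# line and emit them in a consistent format. The log layout is invented.
import re

LINE = "2024-05-01T12:34:56Z sshd[981]: Failed password for jdoe from 198.51.100.7"

PATTERN = re.compile(
    r"(?P<time>\S+) (?P<process>\w+)\[\d+\]: "
    r"Failed password for (?P<user>\S+) from (?P<src_ip>\S+)"
)

match = PATTERN.match(LINE)
if match:
    parsed = match.groupdict()          # consistent key/value representation
    parsed["event_type"] = "auth_failure"
    print(parsed)
```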

What is the difference between ELT and ETL?

At a basic level, the two data integration methods differ in the order and location of the data transformation step. Both methods move data from its source or sources to its destination; however, changing the order can affect costs as well as the tools and underlying infrastructure required.
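
A toy Python sketch of the ordering difference, using a plain list as a stand-in for the destination system; the transformation itself is invented.

```python
# Sketch of ETL vs. ELT. In ETL the data is transformed before it is loaded
# into the destination; in ELT the raw data is loaded first and transformed
# at the destination. The "destination" here is just a Python list.
source = [{"USER": " JDoe ", "ip": "10.0.0.1"}]

def transform(row: dict) -> dict:
    """Illustrative cleanup: lowercase keys, strip whitespace from values."""
    return {k.lower(): v.strip() if isinstance(v, str) else v for k, v in row.items()}

# ETL: extract -> transform -> load
etl_destination = [transform(row) for row in source]

# ELT: extract -> load (raw) -> transform at the destination
elt_destination = list(source)
elt_destination = [transform(row) for row in elt_destination]

print(etl_destination == elt_destination)  # same result, different order/location
```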

What is Security Telemetry?

Telemetry is raw technical data, often scattered like the 1,000 pieces in a puzzle box. Viewed in isolation, the individual data points make little sense; aggregated and correlated, they can give a comprehensive picture of the organization’s security program. Aggregating, correlating, and analyzing this data enables operations, security, and compliance teams to make data-driven decisions.
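
A small sketch of the aggregation step, grouping invented telemetry points by host so that related signals from different tools appear side by side:

```python
# Sketch: individual telemetry points mean little alone, but grouping
# (aggregating) them by host starts to tell a story. All data is invented.
from collections import defaultdict

telemetry = [
    {"host": "lt-77", "source": "edr",      "event": "new_process: psexec.exe"},
    {"host": "lt-77", "source": "firewall", "event": "outbound to 203.0.113.10:445"},
    {"host": "db-01", "source": "auth",     "event": "failed logon for svc_backup"},
]

by_host = defaultdict(list)
for point in telemetry:
    by_host[point["host"]].append(f'{point["source"]}: {point["event"]}')

for host, events in by_host.items():
    print(host, "->", events)
```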

What is Entity Resolution?

Entity resolution is the process of identifying and linking records that refer to the same real-world people, devices, and applications, so that security teams can correlate information from various tools across any security event, regardless of log source.
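
A minimal sketch, assuming a shared email address is the linking identifier; real entity resolution uses many identifiers and fuzzier matching, and all records below are invented.

```python
# Sketch of entity resolution: records from different tools describe the same
# person under different identifiers; linking them on a shared key (email)
# yields one resolved entity.
hr = {"email": "jdoe@example.com", "employee_id": "E1", "name": "Jane Doe"}
vpn = {"username": "jdoe", "email": "jdoe@example.com", "last_ip": "10.0.4.17"}
edr = {"user": "CORP\\jdoe", "mail": "jdoe@example.com", "device": "lt-77"}

def same_entity(a: dict, b: dict, keys=("email", "mail")) -> bool:
    """Two records refer to the same entity if they share an identifier value."""
    ids_a = {a[k] for k in keys if k in a}
    ids_b = {b[k] for k in keys if k in b}
    return bool(ids_a & ids_b)

resolved = dict(hr)
for record in (vpn, edr):
    if same_entity(resolved, record):
        resolved.update(record)          # link this tool's record to the entity

print(resolved["name"], resolved["last_ip"], resolved["device"])
```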

Compliance

What is Compliance?

Compliance consists of the internal policies, processes, and activities a company implements and maintains to ensure it follows externally defined rules. Compliance management is the monitoring an organization does to ensure that workforce members follow the internally designed policies, procedures, rules, and behavioral standards.

What is Governance, Risk, and Compliance (GRC)?

Governance, Risk, and Compliance (GRC) is the set of policies, processes, and associated technologies that organizations use to align IT, privacy, and cybersecurity strategies to their business objectives. Organizations use GRC to assess the risks and benefits of adopting business-enabling and revenue-enhancing technologies.