Glossary

This glossary defines the key terminology used in Skyone Studio:

parameter

The values that must be supplied to an operation so that it can be executed. Parameters are defined only within the definition of operations.

The term "parameters", commonly used for various designations, will be replaced by samples, variables and execution data according to the context.

Notation: <parameter>

sample

The sample values provided when a parameter is created. A sample is used to populate the parameter during the build-and-test phase of an operation.

variables

Parameters scoped to the current flow, integration, or context. Variables are created at configuration time, can be passed to operation parameters at runtime, and can be updated by the Update Parameters module. They are classified by their scope of availability: Flow, Integration, and Context.

Notation: [identifier : variable]

execution data

All data received from connector modules or generated during the execution of a flow.

Notation: {module : data_name}

Ad Hoc Reports

Custom reports generated on demand by users for specific needs.

Autonomous Agents

AI agents capable of performing tasks and making decisions without constant human supervision.

Workforce Agents

AI agents designed to work alongside human teams, automating tasks and increasing productivity.

High Adaptability

The ability of AI agents to quickly adjust to changes in their environment.

API (Application Programming Interface)

A set of rules and specifications that software programs follow to communicate with each other.
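
A minimal sketch of the concept in Python, using the third-party requests library; the endpoint, query parameters, and token below are purely illustrative placeholders, not part of any real service.

```python
import requests  # third-party HTTP client (pip install requests)

# Call a hypothetical REST API endpoint (placeholder URL).
response = requests.get(
    "https://api.example.com/v1/orders",
    params={"status": "open"},                    # query-string parameters
    headers={"Authorization": "Bearer <token>"},  # placeholder credential
    timeout=10,
)
response.raise_for_status()  # raise an error for non-2xx responses
orders = response.json()     # parse the JSON body into Python objects
```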

API Gateway

A management tool for APIs that sits in front of an API and acts as a single entry point for a defined group of APIs.

CIF Architecture (Corporate Information Factory)

A data warehouse architecture model proposed by Bill Inmon, focusing on structured, centralized corporate data.

Automation

The use of technology to perform tasks with minimal human assistance.

BI (Business Intelligence)

The process of analyzing data to provide actionable insights that help organizations make informed business decisions.

Big Data

Large volumes of data characterized by the 5 Vs (Volume, Velocity, Variety, Value, and Veracity), requiring advanced technologies for storage and processing.

Plug-and-Play Capability

Integration design that works flexibly with various agents, allowing for easy implementation and adaptation to business needs.

Data Load

The process of inserting data into a data warehouse, which can be either initial or incremental.

Data Cleaning

Part of the ETL process responsible for removing inconsistencies, duplicates, and invalid data.
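
As an illustration, the pandas sketch below removes duplicates and rows with missing values; the column names and data are hypothetical.

```python
import pandas as pd

# Illustrative raw data containing a duplicate row and a missing value.
raw = pd.DataFrame({
    "customer_id": [1, 1, 2, 3],
    "email": ["a@x.com", "a@x.com", None, "c@x.com"],
})

clean = (
    raw.drop_duplicates()          # remove exact duplicate rows
       .dropna(subset=["email"])   # drop rows with an invalid (missing) email
       .reset_index(drop=True)
)
```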

Collaboration

Interaction and cooperation between human users and AI agents.

Data Collection

The process of gathering and measuring information on variables of interest to answer research questions, test hypotheses, and evaluate outcomes.

Goal-Oriented Behavior

The ability of AI agents to make decisions based on context in order to align their actions with strategic business goals.

Cloud Computing

The delivery of computing services—including servers, storage, databases, networking, software, analytics, and intelligence—over the Internet (“the cloud”) to offer faster innovation, flexible resources, and economies of scale.

Connectors

Pre-built integrations that enable the iPaaS layer to connect with various applications and data sources.

Consumption

The use of AI agents and their capabilities.

Data Integration

The process of harmonizing, transforming, and moving data between different systems.

Data Lake

A centralized repository that allows you to store all your structured and unstructured data at any scale.

Data Mart

A subset of a data warehouse focused on a specific business area.

Data Warehouse

A central repository of integrated data from one or more disparate sources. It stores current and historical data in one place and is used for analytical reporting across an organization.

DataOps

A methodology applying DevOps principles to data management, aiming at automation and collaboration.

Data Democratization

The process of making data accessible to everyone in an organization for data-driven decision-making.

Dimension

A data warehouse structure used to categorize and describe facts, such as time, location, and products.

DSS (Decision Support System)

Systems designed to assist in decision-making.

EIS (Executive Information System)

Systems designed to provide strategic information to managers and executives.

ELT (Extract, Load, Transform)

An alternative to ETL, where data is loaded before being transformed.

Data Enrichment

The process of enhancing raw data by adding related information or context.

Contextual Understanding

An AI agent’s ability to interpret the meaning of user input based on surrounding information and past interactions.

ETL (Extract, Transform, Load)

A process that extracts data from various sources, transforms it, and loads it into a data warehouse.
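
A minimal end-to-end sketch in Python, assuming a hypothetical sales_raw.csv with quantity and unit_price columns; SQLite stands in for a real data warehouse. In ELT, by contrast, the raw data would be loaded first and transformed inside the warehouse.

```python
import sqlite3
import pandas as pd

# Extract: read raw records from a source (file name is illustrative).
raw = pd.read_csv("sales_raw.csv")

# Transform: normalize column names and derive a measure before loading.
raw.columns = [c.strip().lower() for c in raw.columns]
raw["total"] = raw["quantity"] * raw["unit_price"]

# Load: write the transformed table into the warehouse (SQLite as a stand-in).
with sqlite3.connect("warehouse.db") as conn:
    raw.to_sql("fact_sales", conn, if_exists="append", index=False)
```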

Fact

Quantitative information stored in a data warehouse, used for analysis, such as sales and profits.

Agentic Workflow

An automated process orchestrated by AI agents to achieve specific goals.

Customizable Workflows

AI workflows that can be adapted to meet specific business requirements.

Agent Workforce

A team of AI agents working collaboratively to achieve complex goals.

Data Governance

The overall management of the availability, usability, integrity, and security of data in an enterprise system.

Governance and Orchestration

The framework that manages the autonomy of AI agents, regulating their access to systems and ensuring secure operations aligned with organizational policies.

GPU (Graphics Processing Unit)

A specialized electronic circuit designed to rapidly manipulate and alter memory to accelerate the creation of images in a frame buffer intended for output to a display device.

Skills

Specific capabilities or functions that can be performed by AI agents.

AI (Artificial Intelligence)

The capability of a machine to mimic intelligent human behavior.

Agentic/Autonomous AI

AI agents that operate independently using generative AI to reduce human intervention and optimize processes.

GenAI (Generative AI)

A type of AI that can create new and original content—such as text, images, music, or code—in response to user prompts or inputs. It uses advanced machine learning models, such as deep neural networks, to analyze and understand patterns in large datasets, allowing it to generate results often indistinguishable from human work.

GenAI Applications

  • Content Creation: GenAI can be used to create written content such as articles, blog posts, poems, and even scripts for films or plays.

  • Image Generation: It can generate realistic or abstract images based on text descriptions, opening new possibilities for digital art and design.

  • Software Development: GenAI can assist in writing code, suggesting snippets, autocompleting lines, and even identifying bugs.

  • Customer Service: Chatbots powered by GenAI can provide more natural and personalized responses, enhancing user experience.

  • Research and Development: GenAI can simulate scenarios and generate synthetic data, accelerating R&D in fields like medicine and engineering.

GenAI Benefits

  • Creativity Boost: GenAI can help unlock creativity by providing new ideas and inspiration.

  • Efficiency Improvement: It can automate repetitive, time-consuming tasks, freeing up time for more strategic and creative activities.

  • Personalization: GenAI can generate personalized content and experiences for each user, increasing engagement and satisfaction.

  • Innovation: It can drive new discoveries and advancements across various fields, fostering innovation and progress.

GenAI Challenges

  • Bias and Discrimination: GenAI models may perpetuate biases in training data, producing unfair or harmful outcomes.

  • Plagiarism and Copyright: GenAI raises concerns about plagiarism and copyright, as it can generate content resembling others’ work.

  • Misinformation: GenAI can be used to create and spread misinformation, with serious consequences for society.

  • Ethics and Responsibility: The use of GenAI brings ethical and accountability concerns, requiring careful consideration of its impact and implications.

IDP (Intelligent Document Processing)

A technology that uses AI to automatically extract and process information from documents.

Data Ingestion

The process of importing data from various sources into a data storage system.

Inmon (Bill Inmon)

One of the originators of the data warehouse concept, advocating for centralized data structuring.

Integration

The process of connecting different systems and data sources to work together.

Hybrid Integration

The ability to connect and integrate systems and data across both on-premises and cloud environments.

Data Export Interoperability

A system’s ability to easily export and integrate data with other platforms and systems.

iPaaS (Integration Platform as a Service)

A suite of cloud-based services that enables the development, execution, and governance of integration flows connecting various applications, data, and processes.

Context Window

The token limit that a large language model (LLM) can process at one time.

Kimball (Ralph Kimball)

Creator of a data warehouse methodology based on data marts and dimensional modeling.

Lakehouse

A data management system that combines the features of a data lake and a data warehouse, offering scalability and flexibility along with structure and governance.

LLM (Large Language Model)

An advanced AI model trained on large amounts of text data, capable of understanding, generating, and interpreting natural language with high accuracy.

Private LLM

A large language model hosted in a secure and private environment.

LMM (Large Multimodal Model)

An AI model similar to LLMs but capable of processing and generating multiple types of data, such as text, images, and audio.

Master Data Management (MDM)

A strategy to standardize and manage master data across an organization.

Metabase

An open-source business intelligence tool that allows users to explore and visualize data.

Metadata

Data that describes other data, providing context and structure.

Pre-Built Models

Ready-to-use AI models that can be deployed quickly.

ODS (Operational Data Store)

An intermediate repository between operational systems and the data warehouse, used for real-time data integration.

OLAP (Online Analytical Processing)

Technology that enables complex, multidimensional analysis of stored data.

OLTP (Online Transaction Processing)

Transactional systems optimized for frequent data writing and updating.
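
To contrast the two entries above, the sketch below starts from OLTP-style transaction rows (hypothetical data) and builds an OLAP-style view that slices revenue across two dimensions at once.

```python
import pandas as pd

# OLTP-style records: one row per transaction, optimized for writes.
sales = pd.DataFrame({
    "region":  ["North", "North", "South", "South"],
    "quarter": ["Q1", "Q2", "Q1", "Q2"],
    "revenue": [100, 120, 90, 150],
})

# OLAP-style analysis: aggregate the same data across two dimensions.
cube = sales.pivot_table(
    values="revenue", index="region", columns="quarter", aggfunc="sum"
)
print(cube)
```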

Omnichannel

A multichannel sales approach that provides customers with an integrated experience across online, phone, and physical store interactions.

Data Organization

The method of structuring and arranging data to make it more usable.

Data Pipeline

A set of automated processes that move data between systems.
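
A toy sketch of the idea: three steps, each a function, composed into one automated flow; the source and destination here are placeholders.

```python
def extract() -> list[dict]:
    # Stand-in for reading from an API, file, or database.
    return [{"id": 1, "value": " 42 "}]

def transform(rows: list[dict]) -> list[dict]:
    # Clean and convert each record.
    return [{**row, "value": int(row["value"].strip())} for row in rows]

def load(rows: list[dict]) -> None:
    # Stand-in for a write to a warehouse or target system.
    print(f"loaded {len(rows)} rows")

# The pipeline is the automated composition of the steps.
load(transform(extract()))
```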

Pipelines

Automated workflows within the iPaaS layer that define the sequence of data integration and transformation steps.

Power BI

A Microsoft business analytics service that provides interactive visualizations and BI capabilities with a user-friendly interface for report and dashboard creation.

Real-Time Processing

The processing of data immediately upon input into the system.

Publication

The process of making AI agents and their functionalities available for use.

Multi-Channel Publishing

The ability to deploy AI agents across multiple communication channels.

Data Segmentation

The process of dividing a dataset into distinct groups based on shared characteristics.

Silos

Isolated data systems or departments within an organization that do not share information effectively.

Snowflake Schema

A database model that normalizes dimensional tables, reducing redundancy but increasing query complexity.

Star Schema

A database model with a central fact table and denormalized dimension tables, optimizing query performance.
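
A minimal illustration with hypothetical tables: one central fact table keyed to two denormalized dimension tables, queried with a join-then-aggregate pattern. In a snowflake schema, dim_product would itself be split into further normalized tables.

```python
import pandas as pd

# Central fact table: foreign keys plus numeric measures.
fact_sales = pd.DataFrame({
    "date_id":    [1, 1, 2],
    "product_id": [10, 11, 10],
    "amount":     [250.0, 90.0, 310.0],
})

# Denormalized dimension tables describing the facts.
dim_date = pd.DataFrame({"date_id": [1, 2], "month": ["Jan", "Feb"]})
dim_product = pd.DataFrame({"product_id": [10, 11], "category": ["Tools", "Toys"]})

# A typical star-schema query: join facts to dimensions, then aggregate.
report = (
    fact_sales.merge(dim_date, on="date_id")
              .merge(dim_product, on="product_id")
              .groupby(["month", "category"])["amount"].sum()
)
print(report)
```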

Staging Area

A temporary storage area where extracted data is held before it is transformed and loaded into the data warehouse.

Token

A basic unit of text processed by LLMs; a token may be a whole word or a fragment of a word.

Tokenization

The process of breaking text into smaller units called tokens for LLM processing.
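
A toy illustration only: the function below splits on whitespace, whereas production LLM tokenizers use subword schemes such as byte-pair encoding (BPE), so real token counts differ. The resulting token count is what a model's context window limits.

```python
def naive_tokenize(text: str) -> list[str]:
    """Toy whitespace tokenizer; real LLMs use subword schemes such as BPE."""
    return text.split()

tokens = naive_tokenize("Tokenization breaks text into smaller units")
print(tokens)       # ['Tokenization', 'breaks', 'text', 'into', 'smaller', 'units']
print(len(tokens))  # the context window caps how many tokens fit in one request
```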

Transformation

The act of changing or modifying data or processes.

Data Transformation

The ETL step where data is manipulated to match the format of the data warehouse.

Intelligent Use of Tools

The ability of AI agents to use available tools and skills to interpret context and perform actions accurately.

Veracity

One of the 5 Vs of Big Data, referring to the reliability of collected data.

Volume

One of the 5 Vs of Big Data, referring to the sheer amount of data generated and stored.
