About Daggy
Common information about Daggy and Getting Started
Free code signing on Windows provided by SignPath.io, certificate by SignPath Foundation
Daggy - Data Aggregation Utility and C/C++ developer library for data streams catching
Daggy's main goals are to be server-less, cross-platform, simple, and easy to use.
Daggy can be helpful for developers, QA, DevOps, and engineers to debug, analyze, and control any data streams, including requests and responses, in distributed network systems, for example those based on a micro-service architecture.
Introduction and goal concepts
The Daggy Project consists of:
Core - library for streams aggregation and catching
Daggy - console application for aggregating streams into files
Daggy High Level Design
Basic terms
The main goal of the Daggy Software System is to obtain data from environments, as described by sources, and convert it into streams for aggregators via providers.
An environment contains data for streams. Out of the box, Core supports local and remote environments, and it can be extended with user-defined environments. A local environment is located on the same host as the Daggy Core instance. A remote environment is located on a different host than the Daggy Core instance. A user-defined environment can be located anywhere, such as databases, network disks, etc.
Sources are declarations of how to obtain data from environments. They describe which kind of data needs to be converted to streams and which provider is required.
Here is an example of sources that contains one local environment and one remote environment:
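A minimal sketch of such sources in YAML, assuming the field layout used in the project's examples; the host address, login, key path, and command names are placeholders:

```yaml
sources:
    localhost:
        type: local
        commands:
            pingYa:
                exec: ping ya.ru
                extension: log
    remotehost:
        type: ssh2
        host: 192.168.1.9
        authorization:
            login: user
            key: /home/user/.ssh/id_rsa
        commands:
            pingYa:
                exec: ping ya.ru
                extension: log
```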
The streams from the local environment are generated via the local provider (note the type: local field).
The streams from the remote environment are generated via the ssh2 provider (note the type: ssh2 field).
Out of the box, Core provides local and ssh2 providers. Both providers obtain the data for streams from processes: the local provider runs local processes and generates streams from their output channels (stdout and stderr), while the ssh2 provider runs remote processes via the ssh2 protocol and likewise generates streams from their output channels. Daggy Core can be extended with a user-defined provider that generates streams from, for example, an HTTP environment.
Providers generate streams in parts via commands. Each part has a unique seq_num value, assigned uninterruptedly and consistently. This means the full data of a stream can be obtained by concatenating its parts in ascending seq_num order. Each stream is generated by a single command.
The Core delivers streams from any number of providers within a single Core Streams Session. The streams from a Core Streams Session can be aggregated by aggregators or viewed by the user.
Out of the box, the Core provides several types of aggregators:
File - aggregates streams into files at runtime, as data arrives. This aggregator is used by Daggy Console Application.
Console - aggregates streams into console output. This aggregator is used by the Daggy Console Application.
Callback - aggregates streams into ANSI C11 callbacks. This aggregator is used by Core ANSI C11 Interface.
The Core library can be extended by user defined aggregators.
Getting Started
Getting Daggy
Fedora
Windows
Download installer or portable version from releases page.
Linux
Download rpm/deb or portable version from releases page.
MacOS
Download portable version from releases page or install via homebrew:
Install from source with conan
Install from source with cmake (recommended for maintainers)
The tweak number must be set to zero. For example, if you are building version 2.2.1, you need to set -DVERSION=2.2.1.0.
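A sketch of the build-from-source flow, assuming the upstream repository URL and an out-of-source build directory; the version number is only an example:

```shell
# Hypothetical build-from-source flow; adjust the version to the release you build.
git clone https://github.com/synacker/daggy.git
cd daggy && mkdir build && cd build
cmake -DVERSION=2.2.1.0 ..   # release 2.2.1, tweak number set to zero
cmake --build .
```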
Add as conan package dependency
Get daggy from conan-center.
Check installation of Daggy Core C++17/20 interface
Check installation of Daggy Core C11 interface
Check installation of Daggy Console application
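A quick sanity check, assuming the daggy binary is on your PATH and supports the conventional version flag:

```shell
daggy --version
```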
Getting started with data aggregation and streaming using the Daggy Console Application
Simple Sources
Create simple.yaml
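A minimal simple.yaml sketch, assuming the sources layout used in the project's examples; the command name, exec line, and extension are placeholders:

```yaml
sources:
    localhost:
        type: local
        commands:
            pingYa:
                exec: ping ya.ru
                extension: log
```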
Run daggy
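Assuming the daggy binary is on your PATH, pass the sources file as the argument:

```shell
daggy simple.yaml
```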
Check console output
All commands from simple.yaml/simple.json are streamed into the 01-04-20_23-07-23-977_simple directory as output files
Tailing streams from Simple Data Source
Stop data aggregation and streaming
Press CTRL+C to stop data aggregation and streaming. Press CTRL+C twice to hard-stop the application without waiting for the cancellation of child local and remote processes.
Investigate aggregated data
Example of Data Aggregation Sources with multiple commands and remote data aggregation and streaming