Overview
The platform automatically detects when a workflow node produces JSON Lines output, that is, output in which each line is a valid JSON object. When such output is detected, you can create a Live Table directly from it without defining a schema manually. Live Tables are updated after every workflow run, so your database always reflects the latest results.

You do not need to configure output formats or install additional nodes. Any node whose output contains valid JSON Lines appears in the Detected from workflow section of the Database tab.
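For illustration, a minimal sketch of what JSON Lines output looks like (the record and field names here are hypothetical, not actual platform output): each record is serialized as one complete JSON object per line, with no enclosing array.

```python
import json

# Hypothetical scan records; each one becomes a single output line.
records = [
    {"hostname": "app.example.com", "port": 443, "open": True},
    {"hostname": "db.example.com", "port": 5432, "open": False},
]

# JSON Lines: one JSON object per line, no enclosing array.
jsonl = "\n".join(json.dumps(r) for r in records)
print(jsonl)
```

Output of this form is what the detection step looks for after each run.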
How Detection Works
After a workflow run completes, the platform inspects node outputs for JSON Lines format. Each detected output appears as a candidate in the Database tab under Detected from workflow, along with a preview of the fields found and sample values from the actual output.

Creating a Live Table
Open the Database tab
Navigate to your workflow and click the Database tab. If any node outputs have been detected as structured JSON Lines, they appear under the Detected from workflow section.
Select a detected output
Click Create Live Table next to the output you want to use as a data source. A configuration panel opens with a preview of the detected fields and sample data from the last run.
Name the table
Enter a name for the table. Use a name that reflects the content (e.g., open_ports, discovered_subdomains).

Configure columns
Review the list of detected fields. For each field you want to include as a column:

- Toggle the field on to include it in the table.
- Set the data type for the column.

| Type | Use for |
|---|---|
| text | Hostnames, URLs, strings |
| int | Ports, counts, numeric scores |
| int64 | Large integers exceeding standard int range |
| float | Decimal numbers |
| float64 | High-precision decimal numbers |
| bool | True/false flags |
| uuid | Identifiers |
| datetime | Timestamps |
| data | Raw or complex values |

The sample data preview updates as you make changes, so you can verify the column will be populated correctly.
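As a quick illustration (the record and its field names are hypothetical), here is how the fields of a single JSON Lines record might map onto the column types above:

```python
import json

# Hypothetical service-scan record, annotated with a suitable column type.
record = {
    "hostname": "app.example.com",                      # text
    "port": 443,                                        # int
    "latency_ms": 12.5,                                 # float
    "tls_enabled": True,                                # bool
    "scan_id": "1b4e28ba-2fa1-11d2-883f-0016d3cca427",  # uuid
    "last_seen": "2024-05-01T12:00:00Z",                # datetime
}

# Each record would appear in the node output as one JSON line.
print(json.dumps(record))
```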
Set the primary key
Toggle Primary key on for at least one column. You can select multiple columns to form a composite primary key.

Choosing a good primary key:
The primary key uniquely identifies each record. When new data arrives from a workflow run, records with a matching primary key are updated rather than duplicated. Records with a new primary key are inserted as new rows.
- Use fields that are stable and unique per logical record (e.g., hostname for assets, ip + port for services).
- Avoid high-cardinality fields that change between runs if you want to track changes over time rather than create new rows.
How Live Tables Are Updated
After each workflow run, the platform writes new output to all connected Live Tables using the following logic:

- Primary key match found: the non-key columns for that record are updated with the new values.
- No primary key match: a new row is inserted.
- Row not present in new output: the existing row is left unchanged until you act on it (see Schema and data changes below).
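The update logic above can be sketched as a simple upsert keyed on the primary key. This is only an illustration of the behavior, not the platform's implementation; it assumes a composite key of hostname + port and hypothetical data:

```python
# Existing table contents, keyed by the composite primary key.
table = {
    ("app.example.com", 443): {"status": "open"},
    ("old.example.com", 22): {"status": "open"},
}

# Output from the latest workflow run.
new_output = [
    {"hostname": "app.example.com", "port": 443, "status": "filtered"},  # key match: update
    {"hostname": "db.example.com", "port": 5432, "status": "open"},      # new key: insert
]

for record in new_output:
    key = (record["hostname"], record["port"])
    table[key] = {"status": record["status"]}  # upsert the non-key columns

# ("old.example.com", 22) was absent from new_output, so it is left unchanged.
```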
Schema and Data Changes
As workflows evolve, the output structure may change between runs. The platform handles two cases:

New field detected
If a run produces a JSON object with a field that does not exist as a column in the table, the platform notifies you. You can choose to:

- Add the column to the table with a chosen data type.
- Ignore the field, in which case it is excluded from future imports until you act on it.
Field no longer present
If a field that previously existed in the output is no longer found in a run’s data, the platform flags the column as missing from the source. You can choose to:

- Keep the column as-is. Existing values are preserved; new rows will have no value for that column.
- Delete the column from the table entirely, which removes all stored values for that field.
Troubleshooting
No outputs appear under 'Detected from workflow'
Possible causes:
- The workflow has not been run yet. Run the workflow at least once to generate output.
- Node output is not valid JSON Lines. Each line must be a complete, standalone JSON object. Arrays, multi-line JSON, and plain text are not detected.
- The run failed before any output was produced. Check the run status in the Runs tab.
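A quick way to reason about what qualifies: a line is detectable only if it parses, on its own, as a JSON object. This hypothetical check mirrors the rule described above:

```python
import json

def is_jsonl_record(line: str) -> bool:
    """Return True if the line is a standalone JSON object."""
    try:
        return isinstance(json.loads(line), dict)
    except json.JSONDecodeError:
        return False

print(is_jsonl_record('{"host": "a.example.com", "port": 80}'))  # True: standalone object
print(is_jsonl_record('[{"host": "a.example.com"}]'))            # False: array, not object
print(is_jsonl_record('plain text output'))                      # False: not JSON at all
```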
Sample data looks incorrect or truncated
Sample data is drawn from the last completed run. If the output is very large, only a subset of rows is shown in the preview; the full dataset is used when the Live Table is created.

If values look wrong, verify the node output format directly by inspecting the run’s output file.
Duplicate rows appearing in the table
Duplicate rows indicate that the primary key is not unique enough. For example, using only hostname as a primary key when the same hostname can appear with different ports will cause updates rather than inserts, which may not be the intended behavior.

Review your primary key selection and consider using a composite key (e.g., hostname + port) to uniquely identify each logical record.
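The effect of a too-weak key can be seen in this sketch (hypothetical data): keying on hostname alone collapses distinct services into one row, while a composite key preserves both.

```python
# Two distinct services on the same host.
output = [
    {"hostname": "app.example.com", "port": 80},
    {"hostname": "app.example.com", "port": 443},
]

# Hostname-only key: the second record overwrites the first.
by_hostname = {r["hostname"]: r for r in output}

# Composite hostname + port key: both records are kept.
by_host_port = {(r["hostname"], r["port"]): r for r in output}

print(len(by_hostname))   # 1
print(len(by_host_port))  # 2
```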