Overview

The platform automatically detects when a workflow node produces JSON Lines output. Structured output is any node output where each line is a valid JSON object. When this is detected, you can create a Live Table directly from that output without defining a schema manually. Live Tables are updated after every workflow run, so your database always reflects the latest results.
You do not need to configure output formats or install additional nodes. Any node whose output is valid JSON Lines will appear in the Detected from workflow section of the Database tab.

How Detection Works

After a workflow run completes, the platform inspects node outputs for JSON Lines format. Each detected output appears as a candidate in the Database tab under Detected from workflow, along with a preview of the fields found and sample values from the actual output.
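The detection rule can be thought of as a simple per-line check. The sketch below is an illustrative model, not the platform's actual implementation: output qualifies as JSON Lines only when every non-empty line parses as a standalone JSON object.

```python
import json

def is_json_lines(text: str) -> bool:
    """Return True if every non-empty line parses as a standalone JSON object."""
    lines = [ln for ln in text.splitlines() if ln.strip()]
    if not lines:
        return False
    for ln in lines:
        try:
            obj = json.loads(ln)
        except json.JSONDecodeError:
            return False
        # Arrays, scalars, and plain text do not count as structured records.
        if not isinstance(obj, dict):
            return False
    return True

detected = is_json_lines('{"host": "a.example.com", "port": 443}\n'
                         '{"host": "b.example.com", "port": 80}')
plain = is_json_lines("scan complete\n2 hosts found")
```

This is why multi-line JSON and top-level arrays are not picked up: each record must fit on its own line.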

Creating a Live Table

1. Open the Database tab

Navigate to your workflow and click the Database tab. If any node outputs have been detected as structured JSON Lines, they appear under the Detected from workflow section.
2. Select a detected output

Click Create Live Table next to the output you want to use as a data source. A configuration panel opens with a preview of the detected fields and sample data from the last run.
3. Name the table

Enter a name for the table. Use a name that reflects the content (e.g., open_ports, discovered_subdomains).
4. Configure columns

Review the list of detected fields. For each field you want to include as a column:
  • Toggle the field on to include it in the table.
  • Set the data type for the column.
Type      Use for
text      Hostnames, URLs, strings
int       Ports, counts, numeric scores
int64     Large integers exceeding standard int range
float     Decimal numbers
float64   High-precision decimal numbers
bool      True/false flags
uuid      Identifiers
datetime  Timestamps
data      Raw or complex values
The sample data preview updates as you make changes, so you can verify the column will be populated correctly.
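As a rough mental model for how sample values relate to the column types above, here is a hypothetical inference helper. The platform's real type-detection logic is not documented here, and the 32-bit cutoff for int vs. int64 is an assumption for illustration only.

```python
import json
from datetime import datetime
from uuid import UUID

def infer_type(value):
    """Map a sample JSON value to one of the documented column types (illustrative)."""
    if isinstance(value, bool):           # must come before int: bool is a subclass of int
        return "bool"
    if isinstance(value, int):
        # Assumed cutoff: values outside 32-bit range need int64.
        return "int" if -2**31 <= value < 2**31 else "int64"
    if isinstance(value, float):
        return "float"
    if isinstance(value, str):
        try:
            UUID(value)
            return "uuid"
        except ValueError:
            pass
        try:
            datetime.fromisoformat(value)
            return "datetime"
        except ValueError:
            pass
        return "text"
    return "data"  # objects, arrays, null: raw or complex values

record = json.loads('{"host": "a.example.com", "port": 443, "up": true}')
types = {k: infer_type(v) for k, v in record.items()}
```

Ambiguous strings (a hostname that happens to parse as a timestamp, for instance) are exactly why the preview lets you override the detected type per column.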
5. Set the primary key

Toggle Primary key on for at least one column. You can select multiple columns to form a composite primary key.
The primary key uniquely identifies each record. When new data arrives from a workflow run, records with a matching primary key are updated rather than duplicated. Records with a new primary key are inserted as new rows.
Choosing a good primary key:
  • Use fields that are stable and unique per logical record (e.g., hostname for assets, ip + port for services).
  • Avoid high-cardinality fields that change between runs if you want to track changes over time rather than create new rows.
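The effect of the key choice can be seen by keying the same records two ways. The records below are made up for illustration:

```python
# Two services on the same host, as they might appear in scan output.
records = [
    {"hostname": "example.com", "port": 80},
    {"hostname": "example.com", "port": 443},
]

# Keying by hostname alone collapses both services into one row.
by_host = {r["hostname"]: r for r in records}

# A composite key keeps one row per logical service.
by_host_port = {(r["hostname"], r["port"]): r for r in records}
```

With the single-field key the second record silently overwrites the first; the composite key preserves both.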
6. Confirm creation

Click Create Live Table. The table is created immediately and populated with data from the last run.

How Live Tables Are Updated

After each workflow run, the platform writes new output to all connected Live Tables using the following logic:
  • Primary key match found: the non-key columns for that record are updated with the new values.
  • No primary key match: a new row is inserted.
  • Row not present in new output: the existing row is left unchanged until you act on it (see Schema and Data Changes below).
This means your table accumulates all discovered records over time, with non-key fields always reflecting the most recent values seen.
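The update logic above amounts to an upsert keyed on the primary key. A minimal sketch, using an in-memory dict to stand in for the table (names and structure are illustrative, not the platform's internals):

```python
def apply_run(table: dict, new_records: list, key_fields: tuple) -> None:
    """Upsert one run's output into a table keyed by the primary key."""
    for rec in new_records:
        pk = tuple(rec[f] for f in key_fields)
        if pk in table:
            table[pk].update(rec)   # primary key match: overwrite non-key columns
        else:
            table[pk] = dict(rec)   # no match: insert a new row
    # Rows absent from new_records are deliberately left untouched.

table = {("a.example.com", 443): {"host": "a.example.com", "port": 443, "banner": "old"}}
apply_run(table, [
    {"host": "a.example.com", "port": 443, "banner": "nginx/1.25"},
    {"host": "b.example.com", "port": 80, "banner": "apache"},
], key_fields=("host", "port"))
```

After the run, the existing service keeps its row with a refreshed banner, the new service gets a fresh row, and nothing is ever deleted by an update.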

Schema and Data Changes

As workflows evolve, the output structure may change between runs. The platform handles two cases:

New field detected

If a run produces a JSON object with a field that does not exist as a column in the table, the platform notifies you. You can choose to:
  • Add the column to the table with a chosen data type.
  • Ignore the field, in which case it is excluded from future imports until you act on it.

Field no longer present

If a field that previously existed in the output is no longer found in a run’s data, the platform flags the column as missing from the source. You can choose to:
  • Keep the column as-is. Existing values are preserved; new rows will have no value for that column.
  • Delete the column from the table entirely, which removes all stored values for that field.
Deleting a column is permanent. All data stored in that column is removed and cannot be recovered.
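Both cases reduce to a set comparison between a run's fields and the table's columns. A small illustrative sketch (the column and field names are made up):

```python
# Hypothetical table columns and the fields found in the latest run's output.
existing_columns = {"host", "port"}
run_record = {"host": "a.example.com", "port": 443, "banner": "nginx"}
run_fields = set(run_record.keys())

new_fields = run_fields - existing_columns       # would trigger the "new field" prompt
missing_fields = existing_columns - run_fields   # would be flagged as missing from source
```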

Troubleshooting

No detected outputs appear

Possible causes:
  • The workflow has not been run yet. Run the workflow at least once to generate output.
  • Node output is not valid JSON Lines. Each line must be a complete, standalone JSON object. Arrays, multi-line JSON, and plain text are not detected.
  • The run failed before any output was produced. Check the run status in the Runs tab.
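To pinpoint why an output is not being detected, it can help to check the file line by line. A small diagnostic sketch (not a platform tool) that reports the first offending line:

```python
import json

def first_invalid_line(lines):
    """Return (line_number, reason) for the first bad line, or None if all pass."""
    for lineno, line in enumerate(lines, start=1):
        if not line.strip():
            continue  # blank lines are ignored
        try:
            obj = json.loads(line)
        except json.JSONDecodeError:
            return lineno, "not valid JSON"
        if not isinstance(obj, dict):
            return lineno, "valid JSON but not a standalone object"
    return None

# e.g. first_invalid_line(open("output.jsonl")) on a run's output file
bad = first_invalid_line(['{"host": "a"}', '[1, 2, 3]'])
```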
Preview shows unexpected values

Sample data is drawn from the last completed run. If the output is very large, only a subset of rows is shown in the preview; the full dataset is used when the Live Table is created. If values look wrong, verify the node output format directly by inspecting the run’s output file.
Duplicate or incorrectly merged rows

Duplicate rows indicate that the primary key is not unique enough. For example, using only hostname as a primary key when the same hostname can appear with different ports will cause updates rather than inserts, which may not be the intended behavior. Review your primary key selection and consider using a composite key (e.g., hostname + port) to uniquely identify each logical record.

Next Steps