Source config
- Port
- Data format. Allowed values: JSON, DELIMITED, AVRO, LOG. Default - JSON. See Data formats config examples
- Grok pattern definitions file path - path to a file with predefined grok patterns (a sketch of such a file follows this list)
- Grok pattern - the grok pattern used to parse the message
- Delimited format type (CSV, CUSTOM) - CSV is the default, comma-separated format
- Custom delimiter character
- Change field names for DELIMITED data. Format - key:val,key2:val2,key3:val3
- Schema file path for AVRO format - path to a file with the JSON schema
- Max Batch Size (records) - how many records to send to further pipeline stages at once. Default - 1000 records
- Batch Wait Time (ms) - how long to wait for a batch to fill up before sending it. Default - 1000 ms
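The grok pattern definitions file referenced above is a plain-text file with one NAME pattern definition per line; definitions can reuse the built-in patterns. A minimal sketch, with hypothetical pattern names:

```
APP_DATE %{YEAR}-%{MONTHNUM}-%{MONTHDAY}
APP_LOG \[%{APP_DATE:date} %{TIME:time}\] %{LOGLEVEL:level} %{GREEDYDATA:message}
```

The Grok pattern option could then be set to %{APP_LOG}.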
Example:
```
> agent source create
Choose source (influx, kafka, mongo, mysql, postgres, elastic, splunk): splunk
Enter a unique source name: splunk
Ports: 9999
Data format (JSON, DELIMITED, AVRO, LOG) [JSON]:
Source config created

> agent source create
Choose source (influx, kafka, mongo, mysql, postgres, elastic, splunk): splunk
Enter a unique source name: splunk
Ports: 9998
Data format (JSON, DELIMITED, AVRO, LOG) [JSON]: LOG
Grok pattern definitions file path []:
Grok pattern: %{COMMONAPACHELOG}
Source config created

> agent source create
Choose source (influx, kafka, mongo, mysql, postgres, elastic, splunk): splunk
Enter unique name for this source config: splunk_csv
Ports: 9996
Data format (JSON, DELIMITED, AVRO, LOG) [JSON]: DELIMITED
Delimited format type (CSV, CUSTOM): CUSTOM
Custom delimiter character: |
Connecting to the source. Check again after 2 seconds...
...
Change fields names (format - key:val,key2:val2,key3:val3) []: 0:timestamp,2:ver,4:Country,7:Clicks
Source config created
```
Source file config
| Property | Type | Description |
|----------|------|-------------|
| type | String | Source type. Value - splunk |
| name | String | Unique source name; also used as the config file name |
| config | Object | Source configuration |
All properties are required
config object properties:
| Property | Type | Required | Description |
|----------|------|----------|-------------|
| conf.ports | Array | Yes | List of ports to listen on |
| conf.dataFormat | String | No | Allowed values: JSON, DELIMITED, AVRO, LOG. Default - JSON |
| conf.dataFormatConfig.csvFileFormat | String | No | Allowed values: CSV, CUSTOM. Default - CSV |
| conf.csvCustomDelimiter | String | No | Custom delimiter character |
| csv_mapping | Object | No | Column names for delimited data |
| conf.dataFormatConfig.avroSchemaSource | String | No | Allowed values: SOURCE (schema is embedded in the data itself), INLINE (schema is specified in the conf.dataFormatConfig.avroSchema parameter), REGISTRY (Confluent schema registry) |
| conf.dataFormatConfig.avroSchema | Object | No | Avro schema as a JSON object (see the sketch below) |
| conf.dataFormatConfig.schemaRegistryUrls | Array | No | Schema registry URLs |
| conf.dataFormatConfig.schemaLookupMode | String | No | How to look up a schema in the registry. Allowed values: SUBJECT, ID, AUTO |
| conf.dataFormatConfig.subject | String | No | Schema subject (specify if schemaLookupMode is SUBJECT) |
| conf.dataFormatConfig.schemaId | String | No | Schema id (specify if schemaLookupMode is ID) |
| grok_definition_file | String | No | Path to a file with grok pattern definitions |
| conf.dataFormatConfig.grokPattern | String | No | Grok pattern used to parse the message |
Example
```json
{
  "type": "splunk_server",
  "name": "test_splunk_csv",
  "config": {
    "conf.ports": ["9997"],
    "conf.dataFormat": "DELIMITED",
    "csv_mapping": {"0": "timestamp_unix", "2": "ver", "4": "Country", "6": "Exchange", "7": "Clicks"}
  }
}
```
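For the Avro-specific properties in the table above, here is a hedged sketch of a config with an inline schema; the port, record name, and field list are made up for illustration:

```json
{
  "type": "splunk_server",
  "name": "test_splunk_avro",
  "config": {
    "conf.ports": ["9995"],
    "conf.dataFormat": "AVRO",
    "conf.dataFormatConfig.avroSchemaSource": "INLINE",
    "conf.dataFormatConfig.avroSchema": {
      "type": "record",
      "name": "Click",
      "fields": [
        {"name": "timestamp_unix", "type": "long"},
        {"name": "Country", "type": "string"},
        {"name": "Clicks", "type": "int"}
      ]
    }
  }
}
```

With avroSchemaSource set to REGISTRY, conf.dataFormatConfig.schemaRegistryUrls plus a schemaLookupMode (SUBJECT, ID, or AUTO) would be used instead of the inline schema.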
Pipeline config