The Secure60 Collector offers a powerful data enrichment feature that allows you to augment your log data with additional information based on the source IP address. This is particularly useful for adding context such as department, business unit, location, or any other relevant metadata to your events.
This enrichment strategy works by taking an IP address field (by default `ip_src_address`, but configurable via the `ENRICH_SUBNET_SOURCE_FIELD` environment variable) from an incoming event, calculating its subnet based on a configurable prefix/mask, and then looking up this subnet in a CSV (Comma-Separated Values) file that you provide. If a match is found, specified fields from the CSV row are merged into the event.
Default Fields: The collector automatically maps the following fields from your CSV if they exist: `source_department`, `source_business_unit`, `source_location`, `source_criticality`, `technology_group`, and `environment`. You only need to specify the `ENRICH_SUBNET_MAPPING_FIELDS` environment variable if you want to add additional columns from your CSV or use different field names.
The lookup works as follows:

1. The collector reads the IP address from the configured source field (by default, `ip_src_address`).
2. It calculates the subnet from that address. For example, if the IP address is `192.168.1.123` and the lookup prefix is `/24`, the calculated subnet will be `192.168.1.0`.
3. It looks up the calculated subnet (`192.168.1.0`) in a dedicated column (by default, named `subnet`) within your mappings CSV file.
4. On a match, the collector takes the fields listed in the `ENRICH_SUBNET_MAPPING_FIELDS` environment variable and adds them as new fields to the event.

You need to create a CSV file that contains your subnet mappings.
The first column must contain subnet addresses (e.g., `192.168.1.0`, `10.0.0.0`). The default name for this column is `subnet`, and it should match the key used in the internal lookup (which is hardcoded to `subnet` for the `find_enrichment_table_records` function based on the `s60-core.yaml`). Subsequent columns hold the enrichment fields to merge into matching events (e.g., `source_department`, `source_business_unit`, `source_location`).

Example `mappings_subnet.csv`:
```csv
subnet,source_department,source_business_unit,source_location,source_criticality,technology_group,environment
192.168.1.0,IT,Infrastructure,Data Center A,High,Web Services,Production
10.0.0.0,Sales,CRM Team,Cloud,Medium,Application,Production
172.16.0.0,Engineering,R&D,Main Campus,Critical,Development,Development
```
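The lookup mechanics can be sketched in a few lines of Python. This is an illustrative model only, not the collector's actual implementation; it inlines the example table above and uses the standard `ipaddress` module to collapse an IP to its network address:

```python
import csv
import io
import ipaddress

# The example mappings table, inlined so the demo is self-contained.
MAPPINGS_CSV = """subnet,source_department,source_business_unit,source_location,source_criticality,technology_group,environment
192.168.1.0,IT,Infrastructure,Data Center A,High,Web Services,Production
10.0.0.0,Sales,CRM Team,Cloud,Medium,Application,Production
172.16.0.0,Engineering,R&D,Main Campus,Critical,Development,Development
"""

def subnet_key(ip: str, prefix: str = "24") -> str:
    """Collapse an IP address to its network address, e.g. 192.168.1.123 -> 192.168.1.0."""
    return str(ipaddress.ip_network(f"{ip}/{prefix}", strict=False).network_address)

# Index the CSV rows by the subnet column, mirroring the collector's lookup key.
table = {row["subnet"]: row for row in csv.DictReader(io.StringIO(MAPPINGS_CSV))}

row = table[subnet_key("192.168.1.123")]
print(row["source_department"])  # -> IT
```

With a `/16` prefix instead, `192.168.1.123` would collapse to `192.168.0.0`, so your CSV keys must be written at the same prefix length you configure for the lookup.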
To make your CSV file accessible to the Secure60 Collector running in Docker, you need to mount it as a volume.
Using `docker run`: if your CSV file is named `mappings_subnet.csv` and is located in your current directory (`./mappings_subnet.csv`), you would mount it to the default path `/etc/vector/mappings_subnet.csv` inside the container:
```shell
docker run -i --name s60-collector \
  -p 80:80 -p 443:443 -p 514:514/udp -p 6514:6514 -p 5044:5044 \
  -v ./mappings_subnet.csv:/etc/vector/mappings_subnet.csv \
  --rm -d --env-file .env secure60/s60-collector:1.08
```
Make sure to adjust `./mappings_subnet.csv` if your file is in a different location or has a different name. If you change the target path inside the container, you must also update the `ENRICH_SUBNET_MAPPINGS_FILE` environment variable.
Using `docker-compose.yaml`:

```yaml
services:
  s60-collector:
    image: "secure60/s60-collector:1.08"
    container_name: "s60-collector"
    ports:
      - "443:443"
      - "80:80"
      - "514:514/udp"
      - "6514:6514"
    volumes:
      - ./mappings_subnet.csv:/etc/vector/mappings_subnet.csv
    env_file:
      - .env
    restart: 'always'
    logging:
      driver: "json-file"
      options:
        max-size: "50m"
        max-file: "10"
```
Again, ensure the host path (`./mappings_subnet.csv`) correctly points to your file. The container path should match what `ENRICH_SUBNET_MAPPINGS_FILE` expects, or you should set that variable accordingly.
The subnet enrichment feature is controlled by the following environment variables, which you would typically set in your `.env` file:
- `ENRICH_SUBNET_ENABLE`: Enables or disables subnet enrichment. Accepts `true` or `false`; defaults to `false`. Example: `ENRICH_SUBNET_ENABLE=true`
- `ENRICH_SUBNET_SOURCE_FIELD`: The event field that contains the IP address to look up. Defaults to `ip_src_address`. Example: `ENRICH_SUBNET_SOURCE_FIELD=source.ip`
- `ENRICH_SUBNET_MAPPINGS_FILE`: The path to the mappings CSV file inside the container. Defaults to `/etc/vector/mappings_subnet.csv`. Example: `ENRICH_SUBNET_MAPPINGS_FILE=/etc/vector/custom_subnet_data.csv` (if you mounted your CSV to a different path).
- `ENRICH_SUBNET_LOOKUP_PREFIX`: The CIDR prefix (e.g., `/24`) or subnet mask (e.g., `255.255.255.0`) used to calculate the subnet from the IP address field for the lookup. Defaults to `/24`. Example: `ENRICH_SUBNET_LOOKUP_PREFIX=/16` or `ENRICH_SUBNET_LOOKUP_PREFIX=255.255.0.0`
- `ENRICH_SUBNET_MAPPING_FIELDS`: A comma-separated list of CSV columns to merge into matching events. Defaults to `source_department,source_business_unit,source_location,source_criticality,technology_group,environment`. Example: `ENRICH_SUBNET_MAPPING_FIELDS=source_department,source_business_unit,source_location,source_criticality,technology_group,environment,cost_center,asset_tag`
Imagine you are collecting firewall logs. These logs contain the source IP address of traffic but lack contextual information about what that IP address represents in your organization.
You prepare a `mappings_subnet.csv` file:
```csv
subnet,source_department,source_business_unit,source_location,source_criticality,technology_group,environment
10.1.10.0,Finance,Corporate HQ,New York Office,High,Web Services,Production
10.1.20.0,HR,Corporate HQ,New York Office,Medium,Application,Production
192.168.5.0,Guest WiFi,Annex A,Guest Network,Low,Network Infrastructure,Production
```
You configure your `.env` file:
```shell
ENRICH_SUBNET_ENABLE=true
ENRICH_SUBNET_MAPPINGS_FILE=/etc/vector/mappings_subnet.csv
ENRICH_SUBNET_LOOKUP_PREFIX=/24
# No need to specify ENRICH_SUBNET_MAPPING_FIELDS unless you want additional fields
```
You deploy the collector, ensuring `mappings_subnet.csv` is mounted to `/etc/vector/mappings_subnet.csv`.
Now, an incoming event like this:
```json
{
  "timestamp": "2023-10-26T10:00:00Z",
  "ip_src_address": "10.1.10.55",
  "action": "allowed",
  "destination_port": 443
}
```
Will be enriched by the collector to become:
```json
{
  "timestamp": "2023-10-26T10:00:00Z",
  "ip_src_address": "10.1.10.55",
  "action": "allowed",
  "destination_port": 443,
  "source_department": "Finance",
  "source_business_unit": "Corporate HQ",
  "source_location": "New York Office",
  "source_criticality": "High",
  "technology_group": "Web Services",
  "environment": "Production"
}
```
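The merge behaviour in this scenario can be modelled with a short Python sketch (illustrative only, not the collector's code): the default mapping fields are copied onto the event when a subnet row matches, and the event passes through unchanged when none does.

```python
import csv
import io
import ipaddress

# The scenario's mappings file, inlined for a self-contained demo.
MAPPINGS_CSV = """subnet,source_department,source_business_unit,source_location,source_criticality,technology_group,environment
10.1.10.0,Finance,Corporate HQ,New York Office,High,Web Services,Production
10.1.20.0,HR,Corporate HQ,New York Office,Medium,Application,Production
192.168.5.0,Guest WiFi,Annex A,Guest Network,Low,Network Infrastructure,Production
"""

DEFAULT_FIELDS = ["source_department", "source_business_unit", "source_location",
                  "source_criticality", "technology_group", "environment"]

table = {row["subnet"]: row for row in csv.DictReader(io.StringIO(MAPPINGS_CSV))}

def enrich(event: dict, source_field: str = "ip_src_address", prefix: str = "24") -> dict:
    """Merge the default mapping fields into the event when its subnet matches."""
    ip = event.get(source_field)
    if ip is None:
        return event
    key = str(ipaddress.ip_network(f"{ip}/{prefix}", strict=False).network_address)
    row = table.get(key)
    if row:
        event.update({f: row[f] for f in DEFAULT_FIELDS if f in row})
    return event

event = enrich({"timestamp": "2023-10-26T10:00:00Z", "ip_src_address": "10.1.10.55",
                "action": "allowed", "destination_port": 443})
print(event["source_department"], event["environment"])  # -> Finance Production
```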
This enriched data provides much more context for analysis, alerting, and reporting within the Secure60 platform.
In addition to subnet-based enrichment, the Secure60 Collector supports exact field matching enrichment. This strategy works by taking any field value from an incoming event (by default `host_name`, but configurable via the `ENRICH_CUSTOM_EXACT_SOURCE_FIELD` environment variable) and performing an exact match lookup against a CSV file. If a match is found, specified fields from the CSV row are merged into the event.
Default Fields: The collector automatically maps the following fields from your CSV if they exist: `source_department`, `source_business_unit`, `source_location`, `source_criticality`, `technology_group`, and `environment`. You only need to specify the `ENRICH_CUSTOM_EXACT_MAPPING_FIELDS` environment variable if you want to add additional columns from your CSV or use different field names.
The lookup works as follows:

1. The collector reads the value of the configured source field (by default, `host_name`).
2. It looks up this value in the `field_value` column of your mappings CSV file.
3. On a match, the collector takes the fields listed in the `ENRICH_CUSTOM_EXACT_MAPPING_FIELDS` environment variable and adds them as new fields to the event.

You need to create a CSV file that contains your exact field mappings.
The first column must be named `field_value` and contain the exact values you want to match against (e.g., `webserver01`, `firewall-dmz`, `crm-app-staging`). Subsequent columns hold the enrichment fields to merge into matching events (e.g., `source_department`, `source_business_unit`, `source_location`).

Example `mappings_exact.csv`:
```csv
field_value,source_department,source_business_unit,source_location,source_criticality,technology_group,environment
webserver01,IT,Infrastructure,Headquarters,High,Web Services,Production
firewall-dmz,Security,Networking,Perimeter,Critical,Firewall,Production
vpn-gateway01,IT,Networking,Remote Access,High,VPN,Production
kube-master-01,DevOps,Platform Engineering,Kubernetes Cluster,Critical,Orchestration,Production
marketing-site-prod,Marketing,Web Team,Public Website,Medium,Web Server,Production
crm-app-staging,Sales,CRM Team,Staging Environment,Medium,Application,Staging
ldap-auth-primary,IT,Identity Management,Authentication Services,Critical,LDAP,Production
```
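In code terms, exact matching amounts to a dictionary lookup keyed on `field_value`. The sketch below is illustrative, not the collector's implementation, and uses a trimmed three-column version of the table for brevity:

```python
import csv
import io

# Trimmed mappings table for the demo (a subset of the example columns).
MAPPINGS_CSV = """field_value,source_department,source_criticality,environment
webserver01,IT,High,Production
firewall-dmz,Security,Critical,Production
crm-app-staging,Sales,Medium,Staging
"""

table = {row["field_value"]: row for row in csv.DictReader(io.StringIO(MAPPINGS_CSV))}

def enrich_exact(event: dict, source_field: str = "host_name") -> dict:
    """Merge enrichment columns when the source field equals a field_value exactly."""
    row = table.get(event.get(source_field))
    if row:
        # Copy every column except the lookup key itself.
        event.update({k: v for k, v in row.items() if k != "field_value"})
    return event

event = enrich_exact({"host_name": "firewall-dmz"})
print(event["source_department"])  # -> Security
```

Note that in this sketch the match is a plain case-sensitive string comparison: `webserver01` matches, `WebServer01` does not.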
To make your CSV file accessible to the Secure60 Collector running in Docker, you need to mount it as a volume.
Using `docker run`: if your CSV file is named `mappings_exact.csv` and is located in your current directory (`./mappings_exact.csv`), you would mount it to the default path `/etc/vector/mappings_exact.csv` inside the container:
```shell
docker run -i --name s60-collector \
  -p 80:80 -p 443:443 -p 514:514/udp -p 6514:6514 -p 5044:5044 \
  -v ./mappings_exact.csv:/etc/vector/mappings_exact.csv \
  --rm -d --env-file .env secure60/s60-collector:1.08
```
Make sure to adjust `./mappings_exact.csv` if your file is in a different location or has a different name. If you change the target path inside the container, you must also update the `ENRICH_CUSTOM_EXACT_FILE` environment variable.
Using `docker-compose.yaml`:

```yaml
services:
  s60-collector:
    image: "secure60/s60-collector:1.08"
    container_name: "s60-collector"
    ports:
      - "443:443"
      - "80:80"
      - "514:514/udp"
      - "6514:6514"
    volumes:
      - ./mappings_exact.csv:/etc/vector/mappings_exact.csv
    env_file:
      - .env
    restart: 'always'
    logging:
      driver: "json-file"
      options:
        max-size: "50m"
        max-file: "10"
```
Again, ensure the host path (`./mappings_exact.csv`) correctly points to your file. The container path should match what `ENRICH_CUSTOM_EXACT_FILE` expects, or you should set that variable accordingly.
The exact field matching enrichment feature is controlled by the following environment variables, which you would typically set in your `.env` file:
- `ENRICH_CUSTOM_EXACT_ENABLE`: Enables or disables exact field matching enrichment. Accepts `true` or `false`; defaults to `false`. Example: `ENRICH_CUSTOM_EXACT_ENABLE=true`
- `ENRICH_CUSTOM_EXACT_SOURCE_FIELD`: The event field whose value is matched against the CSV. Defaults to `host_name`. Example: `ENRICH_CUSTOM_EXACT_SOURCE_FIELD=application_name`
- `ENRICH_CUSTOM_EXACT_FILE`: The path to the mappings CSV file inside the container. Defaults to `/etc/vector/mappings_exact.csv`. Example: `ENRICH_CUSTOM_EXACT_FILE=/etc/vector/custom_exact_data.csv` (if you mounted your CSV to a different path).
- `ENRICH_CUSTOM_EXACT_MAPPING_FIELDS`: A comma-separated list of CSV columns to merge into matching events. Defaults to `source_department,source_business_unit,source_location,source_criticality,technology_group,environment`. Example: `ENRICH_CUSTOM_EXACT_MAPPING_FIELDS=source_department,source_business_unit,source_location,source_criticality,technology_group,environment,cost_center,asset_tag`
Imagine you are collecting application logs from various servers and applications. These logs contain hostnames or application names but lack organizational context about what these systems represent.
You prepare a `mappings_exact.csv` file:
```csv
field_value,source_department,source_business_unit,source_location,source_criticality,technology_group,environment
web-prod-01,IT,Infrastructure,Data Center A,High,Web Services,Production
db-primary,IT,Infrastructure,Data Center A,Critical,Database,Production
crm-staging,Sales,CRM Team,Cloud,Medium,Application,Staging
analytics-worker,Data Engineering,Analytics,Data Center B,Medium,Big Data,Production
auth-service,Security,Identity Management,Cloud,Critical,Authentication,Production
```
You configure your `.env` file:
```shell
ENRICH_CUSTOM_EXACT_ENABLE=true
ENRICH_CUSTOM_EXACT_SOURCE_FIELD=host_name
ENRICH_CUSTOM_EXACT_FILE=/etc/vector/mappings_exact.csv
# No need to specify ENRICH_CUSTOM_EXACT_MAPPING_FIELDS unless you want additional fields
```
You deploy the collector, ensuring `mappings_exact.csv` is mounted to `/etc/vector/mappings_exact.csv`.
Now, an incoming event like this:
```json
{
  "timestamp": "2023-10-26T10:00:00Z",
  "host_name": "web-prod-01",
  "service": "nginx",
  "message": "Request processed successfully",
  "response_code": 200
}
```
Will be enriched by the collector to become:
```json
{
  "timestamp": "2023-10-26T10:00:00Z",
  "host_name": "web-prod-01",
  "service": "nginx",
  "message": "Request processed successfully",
  "response_code": 200,
  "source_department": "IT",
  "source_business_unit": "Infrastructure",
  "source_location": "Data Center A",
  "source_criticality": "High",
  "technology_group": "Web Services",
  "environment": "Production"
}
```
Unlike subnet enrichment, which is specifically designed for IP addresses, exact field matching can work with any field in your events. You can configure it to match against:

- Hostnames: `host_name`, `hostname`, `server_name`
- Application names: `app_name`, `application`, `service_name`
- User identifiers: `user_id`, `username`, `employee_id`
- Device identifiers: `device_id`, `asset_tag`, `serial_number`

Simply change the `ENRICH_CUSTOM_EXACT_SOURCE_FIELD` environment variable to point to the field you want to use for matching.
Both subnet-based and exact field matching enrichment can be enabled simultaneously. The collector will apply both enrichment strategies to your events, allowing you to add both network-based context (from subnet enrichment) and asset-specific context (from exact field matching) to the same events.
This enriched data provides comprehensive organizational context for analysis, alerting, and reporting within the Secure60 platform.
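How the two strategies compose can be sketched in Python. This is an illustrative model, not the collector's code; the tables and field names are assumptions drawn from the examples in this document, and each strategy independently merges its own columns onto the same event:

```python
import ipaddress

# Minimal in-memory tables standing in for the two mappings CSV files.
subnet_table = {"10.1.10.0": {"source_department": "Finance"}}
exact_table = {"web-prod-01": {"technology_group": "Web Services"}}

def enrich_both(event: dict) -> dict:
    """Apply subnet enrichment, then exact field matching, to one event."""
    ip = event.get("ip_src_address")
    if ip:
        key = str(ipaddress.ip_network(f"{ip}/24", strict=False).network_address)
        event.update(subnet_table.get(key, {}))
    event.update(exact_table.get(event.get("host_name"), {}))
    return event

event = enrich_both({"ip_src_address": "10.1.10.55", "host_name": "web-prod-01"})
print(event["source_department"], event["technology_group"])  # -> Finance Web Services
```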
When working with CSV files for data enrichment, there are strict rules that must be followed to ensure proper functionality. These rules apply to both subnet-based and exact field matching enrichment.
If you change the column names or count inside your CSV file, you must update the corresponding mapping fields environment variable to match exactly.
For Subnet Enrichment, the `ENRICH_SUBNET_MAPPING_FIELDS` variable defaults to `source_department,source_business_unit,source_location,source_criticality,technology_group,environment`. For Exact Field Matching, the `ENRICH_CUSTOM_EXACT_MAPPING_FIELDS` variable has the same default.

Example Problem: your CSV uses different column names than the default mapping fields:

```csv
subnet,department,business_unit,location,criticality,tech_group,env
192.168.1.0,IT,Infrastructure,Data Center A,High,Web Services,Production
```

Solution: You must update your environment variable to match the new column names:

```shell
ENRICH_SUBNET_MAPPING_FIELDS=department,business_unit,location,criticality,tech_group,env
```
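A quick way to catch this mismatch before deploying is to compare the configured field list against the CSV header. The helper below is a suggested pre-flight check, not part of the collector:

```python
import csv
import io

def missing_mapping_fields(csv_text: str, mapping_fields: str) -> list:
    """Return the configured mapping fields that do not appear in the CSV header."""
    header = next(csv.reader(io.StringIO(csv_text)))
    return [name for name in mapping_fields.split(",") if name not in header]

csv_text = "subnet,department,business_unit\n192.168.1.0,IT,Infrastructure\n"

# Stale variable -> both configured fields are reported as missing.
print(missing_mapping_fields(csv_text, "source_department,source_business_unit"))
# Updated variable -> nothing missing.
print(missing_mapping_fields(csv_text, "department,business_unit"))  # -> []
```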
The first column of your CSV file serves as the lookup key and must not be renamed or repositioned. For Subnet Enrichment, it must be named `subnet` and contain network addresses (e.g., `192.168.1.0`). For Exact Field Matching, it must be named `field_value` and contain the exact values to match against.

Incorrect Example:

```csv
network,source_department,source_business_unit # ❌ Wrong - first column renamed
192.168.1.0,IT,Infrastructure
```

Correct Example:

```csv
subnet,source_department,source_business_unit # ✅ Correct - first column unchanged
192.168.1.0,IT,Infrastructure
```
Every data row in your CSV must have the same number of columns as defined in the header row. Columns can be empty, but they cannot be missing.
Incorrect Example:

```csv
subnet,source_department,source_business_unit,source_location
192.168.1.0,IT,Infrastructure # ❌ Missing source_location column
10.0.0.0,Sales,CRM Team,Cloud,Extra # ❌ Too many columns
```

Correct Example:

```csv
subnet,source_department,source_business_unit,source_location
192.168.1.0,IT,Infrastructure, # ✅ Empty but present
10.0.0.0,Sales,CRM Team,Cloud # ✅ All columns present
172.16.0.0,,, # ✅ All empty but present
```
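Column counts are easy to validate before mounting the file. This small helper is a suggested check, not part of the collector; reported line numbers start at 2 because the header is line 1:

```python
import csv
import io

def inconsistent_rows(csv_text: str) -> list:
    """Return (line_number, column_count) for rows whose width differs from the header."""
    reader = csv.reader(io.StringIO(csv_text))
    header = next(reader)
    return [(line, len(row)) for line, row in enumerate(reader, start=2)
            if len(row) != len(header)]

bad = ("subnet,source_department,source_business_unit,source_location\n"
       "192.168.1.0,IT,Infrastructure\n"           # missing a column
       "10.0.0.0,Sales,CRM Team,Cloud,Extra\n")    # one column too many
good = ("subnet,source_department,source_business_unit,source_location\n"
        "172.16.0.0,,,\n")                         # empty but present columns

print(inconsistent_rows(bad))   # -> [(2, 3), (3, 5)]
print(inconsistent_rows(good))  # -> []
```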
Symptoms: events pass through the collector unchanged, with no enrichment fields added.

Possible Causes and Solutions:

- CSV file not mounted correctly: verify the volume mount and confirm the container path matches the corresponding mappings file environment variable.
- Field names don't match: ensure the fields listed in `ENRICH_SUBNET_MAPPING_FIELDS` or `ENRICH_CUSTOM_EXACT_MAPPING_FIELDS` exactly match your CSV column headers.
- Lookup field missing or incorrect: check that incoming events actually contain the field named by `ENRICH_SUBNET_SOURCE_FIELD` (default: `ip_src_address`) or `ENRICH_CUSTOM_EXACT_SOURCE_FIELD` (default: `host_name`).
- CSV format issues: confirm the first column name is correct and that every row has the same number of columns as the header.
Symptoms: events are enriched, but some expected fields are missing or empty.

Possible Causes and Solutions:

- Incomplete CSV data: check the matching row for empty columns; empty values produce empty fields.
- Mapping field mismatch: compare the `ENRICH_*_MAPPING_FIELDS` environment variable with the actual CSV column names.
environment variable with actual CSV column namesSymptoms:
Possible Causes and Solutions:
File path issues
CSV parsing errors