Creating a webhook server
1. Create a server that receives incoming HTTP requests and prints the body (webhookserver.sh).
#!/bin/bash
PORT=9090

while true; do
  nc -l -p $PORT | while read -r line; do
    echo "$line"
  done
  # To reply instead of leaving the client hanging, pipe a response back into nc:
  #echo -e "HTTP/1.1 200 OK\r\nContent-Type: text/plain\r\n\r\nWebhook received"
done
Run the script and wait for incoming requests.
2. Create a webhook connector in Kibana and send the following body; the webhook server displays the output below. (There is an error here, but let's move on for now.)


Attach the webhook connector to a rule.
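The error mentioned above is that the nc loop never sends an HTTP response, so the connector's request is left hanging. As a rough illustration of a receiver that both prints the body and replies, here is a minimal Python sketch (the `WebhookHandler` class and `serve` helper are my own; only the port number comes from the script above):

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

class WebhookHandler(BaseHTTPRequestHandler):
    """Print each incoming body, then reply 200 so the client is not left hanging."""

    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        body = self.rfile.read(length)
        print(body.decode("utf-8", errors="replace"))  # show the webhook payload
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(b"Webhook received")

def serve(port: int = 9090) -> None:
    # Same port as the shell script; blocks until interrupted.
    HTTPServer(("", port), WebhookHandler).serve_forever()
```

Point the Kibana webhook connector at `http://<host>:9090` as before; this version acknowledges each request instead of timing out.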
simulate
POST /_ingest/pipeline/_simulate?verbose
{
  "pipeline": {
    "processors": [
      {
        "script": {
          "lang": "painless",
          "source": """
            long bytes = Processors.bytes(ctx.size);
            ctx.size_in_bytes = bytes;
          """
        }
      },
      {
        "script": {
          "lang": "painless",
          "source": """
            def time = "2025-02-23T01:26:10.593051376Z";
            ZonedDateTime zdt = ZonedDateTime.parse(time);
            int year = zdt.getYear();
            int month = zdt.getMonthValue();
            int day = zdt.getDayOfMonth();
            String newDate = Integer.toString(year) + (month < 10 ? '0' + Integer.toString(month) : Integer.toString(month)) + (day < 10 ? '0' + Integer.toString(day) : Integer.toString(day));
            Random random = new Random();
            int randomNumber = 10000000 + random.nextInt(90000000);
            String finalResult = newDate + randomNumber;
            ctx['megid'] = "Msg" + finalResult;
          """
        }
      }
    ]
  },
  "docs": [
    {
      "_source": {
        "size": "1kb",
        "name": "myalert"
      }
    }
  ]
}
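Outside of Painless, the size-string conversion done by `Processors.bytes` in the first script can be sketched in plain Python (the `UNITS` table and `to_bytes` helper are hypothetical; the real processor supports more units and stricter parsing):

```python
# Rough Python equivalent of Processors.bytes in the first script:
# convert a human-readable size string such as "1kb" into a byte count.
UNITS = {"b": 1, "kb": 1024, "mb": 1024**2, "gb": 1024**3}

def to_bytes(size: str) -> int:
    size = size.strip().lower()
    # Check longer suffixes first so "kb" is not mistaken for "b".
    for suffix in sorted(UNITS, key=len, reverse=True):
        if size.endswith(suffix):
            return int(float(size[: -len(suffix)]) * UNITS[suffix])
    raise ValueError(f"unrecognized size: {size}")

print(to_bytes("1kb"))  # 1024
```

For the simulated doc above (`"size": "1kb"`), `size_in_bytes` comes back as 1024.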
my-alert-pipeline
PUT _ingest/pipeline/my-alert-pipeline
{
  "description": "my-alert-pipeline-test",
  "processors": [
    {
      "script": {
        "lang": "painless",
        "source": """
          ZonedDateTime zdt = ZonedDateTime.parse(ctx['@timestamp']);
          int year = zdt.getYear();
          int month = zdt.getMonthValue();
          int day = zdt.getDayOfMonth();
          String newDate = Integer.toString(year).substring(2) + (month < 10 ? '0' + Integer.toString(month) : Integer.toString(month)) + (day < 10 ? '0' + Integer.toString(day) : Integer.toString(day));
          Random random = new Random();
          int randomNumber = 10000000 + random.nextInt(90000000);
          String finalResult = newDate + randomNumber;
          ctx['megid'] = "Msg-" + finalResult;
        """
      }
    }
  ]
}
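The generated ID format can be checked outside Elasticsearch with a small Python sketch of the same steps (the `make_msg_id` helper is mine, not part of the pipeline):

```python
from datetime import datetime
import random

def make_msg_id(ts: str) -> str:
    # Same steps as the Painless script: parse the timestamp, then build
    # a two-digit year + zero-padded month/day prefix.
    dt = datetime.strptime(ts[:19], "%Y-%m-%dT%H:%M:%S")
    new_date = dt.strftime("%y%m%d")
    random_number = 10000000 + random.randrange(90000000)  # 8-digit random suffix
    return "Msg-" + new_date + str(random_number)

print(make_msg_id("2025-02-23T01:26:10.593051376Z"))
```

For the sample timestamp the result is `Msg-250223` followed by eight random digits, matching what the pipeline writes into `megid`.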
PUT /products
{
  "mappings": {
    "properties": {
      "@timestamp": {
        "type": "date"
      }
    }
  }
}
Updating the event doc
POST /logs-kbdrule.alerts-default/_doc/
{
  "@timestamp": "2025-02-23T06:01:16.671Z",
  "product_name": "Apple iPhone 13",
  "price": 801,
  "category": "Smartphones"
}
RULE -> Watch
Editing crontab
crontab -e
Using crontab
* * * * * command_to_run
- - - - -
| | | | |
| | | | +---- Day of the week (0 - 6) (Sunday=0)
| | | +------ Month (1 - 12)
| | +-------- Day of the month (1 - 31)
| +---------- Hour (0 - 23)
+------------ Minute (0 - 59)
Examples
# Run a script every day at 3:30 AM
30 3 * * * /path/to/your/script.sh
# Run a command at 5:00 PM on the 1st of every month
0 17 1 * * /path/to/your/command
# Run a script every Monday at 8:00 AM
0 8 * * 1 /path/to/your/script.sh
# Run a script every day at midnight (00:00)
0 0 * * * /path/to/your/script.sh
# Run a script every day at 09:00 AM
0 9 * * * /path/to/your/script.sh
Save and Exit
To verify your cron jobs, you can list them with:
crontab -l
Action variable example: {{context.alerts.0.signal.rule.updated_at}}
aggregation
GET logs-sentinel_one.edr-default/_search
{
  "size": 0,
  "aggs": {
    "ratings": {
      "terms": {
        "field": "event.category"
      }
    }
  }
}
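What a terms aggregation returns can be mimicked locally. Here is a Python sketch over a few hypothetical documents (the `docs` list is made up for illustration; it stands in for hits from logs-sentinel_one.edr-default):

```python
from collections import Counter

# Hypothetical sample documents standing in for index hits.
docs = [
    {"event": {"category": "process"}},
    {"event": {"category": "file"}},
    {"event": {"category": "process"}},
    {"event": {"category": "network"}},
]

# A terms aggregation buckets documents by each distinct value of the
# field and counts them, most frequent first.
buckets = Counter(doc["event"]["category"] for doc in docs)
print(buckets.most_common())  # [('process', 2), ('file', 1), ('network', 1)]
```

Each `(key, doc_count)` pair corresponds to one bucket in the `ratings` aggregation response.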
Combined with a query clause
GET logs-sentinel_one.edr-default/_search
{
  "size": 0,
  "query": {
    "match": {
      "s1.endpoint.name": "DESKTOP-QG3AB71"
    }
  },
  "aggs": {
    "ratings": {
      "terms": {
        "field": "event.category"
      }
    }
  }
}
#histogram
GET logs-sentinel_one.edr-default/_search
{
  "size": 0,
  "query": {
    "match": {
      "s1.endpoint.name": "DESKTOP-QG3AB71"
    }
  },
  "aggs": {
    "filesize": {
      "histogram": {
        "field": "file.size",
        "interval": 100000
      }
    }
  }
}
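The histogram aggregation can be sketched the same way: each document's value is rounded down to a multiple of the interval, and that lower bound becomes the bucket key (the `sizes` list below is hypothetical sample data):

```python
from collections import Counter

# Hypothetical file sizes in bytes; the bucket key is the lower bound
# of each interval-wide range, as in the histogram aggregation above.
sizes = [12_345, 150_000, 180_000, 420_000, 95_000]
interval = 100_000

buckets = Counter((size // interval) * interval for size in sizes)
print(sorted(buckets.items()))  # [(0, 2), (100000, 2), (400000, 1)]
```

Note that Elasticsearch also emits empty buckets between the min and max keys unless `min_doc_count` is raised; this sketch only shows non-empty ones.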
GET logs-sentinel_one.edr-default/_search
{
  "size": 0,
  "query": {
    "match": {
      "s1.endpoint.name": "DESKTOP-QG3AB71"
    }
  },
  "aggs": {
    "timestamp": {
      "date_histogram": {
        "field": "@timestamp",
        "calendar_interval": "hour"
      }
    }
  }
}