I have installed td-agent on a RHEL 7 machine with 2 CPUs and 4 GB of RAM:
- td-agent 1.2.6
- gem 'fluent-plugin-netflow' version '0.2.8'
- gem 'fluentd' version '1.2.6'
Tested with the packet generator below:
- https://github.com/mshindo/NetFlow-Generator
- ./flowgen -n10000 -i50 -w1 -p5160 localhost
Each time I send data with the packet generator and write it to a file, I get no more than 3,000 events. I have tested this by sending 5,000, 10,000, and 50,000 packets; in no case were more than 3,000 records written to the file.
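One thing worth checking is whether the kernel is discarding datagrams before td-agent ever reads them. A quick way to look for that on Linux (a diagnostic sketch; the exact counter names can vary slightly between kernels):

# Snapshot the kernel UDP counters, replay the generator, then compare.
# Growing "packet receive errors" / "receive buffer errors" during the
# run would point at the socket receive buffer overflowing.
netstat -su | grep -A6 '^Udp:'
./flowgen -n10000 -i50 -w1 -p5160 localhost
netstat -su | grep -A6 '^Udp:'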
td-agent.conf:
<source>
  @type netflow
  tag netflow.event
  port 5160
  cache_ttl 6000
  versions [5, 9]
</source>
<match netflow.event>
  @type file
  @log_level debug
  path /apps/tddump/dump1
  buffer_type file
  buffer_queue_limit 50000
  buffer_chunk_limit 24m
  flush_interval 30
</match>
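If the UDP counters above do grow during a run, a common mitigation (an assumption on my part, not something I have verified against this plugin) is to raise the kernel's default and maximum UDP receive buffer sizes so the socket can absorb bursts:

# Example values only; rmem_default matters because a process that never
# calls setsockopt(SO_RCVBUF) gets the kernel's default buffer size.
sysctl -w net.core.rmem_default=1048576
sysctl -w net.core.rmem_max=4194304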
td-agent logs:
2018-11-02 09:38:52 -0400 [info]: using configuration file: <ROOT>
  <source>
    @type netflow
    tag "netflow.event"
    port 5160
    cache_ttl 6000
    versions [5,9]
  </source>
  <match netflow.event>
    @type file
    @log_level "debug"
    path "/apps/tddump/dump1"
    buffer_type file
    buffer_queue_limit 50000
    buffer_chunk_limit 24m
    flush_interval 30
    <buffer time>
      @type file
      flush_interval 30
      chunk_limit_size 24m
      queue_limit_length 50000
      timekey 86400
      path /apps/tddump/dump1
    </buffer>
  </match>
</ROOT>
2018-11-02 09:38:52 -0400 [info]: starting fluentd-1.2.6 pid=605 ruby="2.4.4"
2018-11-02 09:38:52 -0400 [info]: spawn command to main: cmdline=["/opt/td-agent/embedded/bin/ruby", "-Eascii-8bit:ascii-8bit", "/opt/td-agent/embedded/bin/fluentd", "--log", "/var/log/td-agent/td-agent.log", "--daemon", "/var/run/td-agent/td-agent.pid", "--under-supervisor"]
2018-11-02 09:38:53 -0400 [info]: gem 'fluent-plugin-elasticsearch' version '2.11.11'
2018-11-02 09:38:53 -0400 [info]: gem 'fluent-plugin-kafka' version '0.7.9'
2018-11-02 09:38:53 -0400 [info]: gem 'fluent-plugin-netflow' version '0.2.8'
2018-11-02 09:38:53 -0400 [info]: gem 'fluent-plugin-record-modifier' version '1.1.0'
2018-11-02 09:38:53 -0400 [info]: gem 'fluent-plugin-rewrite-tag-filter' version '2.1.0'
2018-11-02 09:38:53 -0400 [info]: gem 'fluent-plugin-s3' version '1.1.6'
2018-11-02 09:38:53 -0400 [info]: gem 'fluent-plugin-td' version '1.0.0'
2018-11-02 09:38:53 -0400 [info]: gem 'fluent-plugin-td-monitoring' version '0.2.4'
2018-11-02 09:38:53 -0400 [info]: gem 'fluent-plugin-webhdfs' version '1.2.3'
2018-11-02 09:38:53 -0400 [info]: gem 'fluentd' version '1.2.6'
2018-11-02 09:38:53 -0400 [info]: adding match pattern="netflow.event" type="file"
2018-11-02 09:38:53 -0400 [info]: #0 'flush_interval' is configured at out side of <buffer>. 'flush_mode' is set to 'interval' to keep existing behaviour
2018-11-02 09:38:53 -0400 [info]: adding source type="netflow"
2018-11-02 09:38:53 -0400 [info]: #0 starting fluentd worker pid=615 ppid=610 worker=0
2018-11-02 09:38:53 -0400 [debug]: #0 buffer started instance=70093046745520 stage_size=0 queue_size=0
2018-11-02 09:38:53 -0400 [debug]: #0 flush_thread actually running
2018-11-02 09:38:53 -0400 [info]: #0 listening netflow socket on 0.0.0.0:5160 with udp
2018-11-02 09:38:53 -0400 [info]: #0 fluentd worker is now running worker=0
2018-11-02 09:38:53 -0400 [debug]: #0 enqueue_thread actually running
2018-11-02 09:41:14 -0400 [debug]: #0 Created new chunk chunk_id="579aeb06b11c1609e89807a451f5f4a0" metadata=#<struct Fluent::Plugin::Buffer::Metadata timekey=1541131200, tag=nil, variables=nil>
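For what it is worth, the debug lines show the buffer starting with stage_size=0 queue_size=0 and only a single chunk ever being created, so the file buffer is nowhere near its 50,000-chunk queue limit. If I am reading that right, the loss happens before events reach the output buffer, i.e. at the UDP socket or in the netflow parser.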
I installed Logstash on the same machine and was able to write 10,000, 100,000, and even 3 million events to a file with the following config:
input {
  udp {
    type => "netflow"
    port => 5160
    codec => netflow {
      versions => [5,9]
    }
    workers => 2
    receive_buffer_bytes => 212992
    queue_size => 500000
  }
}
output {
  file {
    path => "/apps/dump/dump2"
  }
}
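My guess as to why Logstash keeps up: receive_buffer_bytes sets the socket's SO_RCVBUF explicitly, workers => 2 parses datagrams on two threads, and queue_size => 500000 gives the input a large in-memory packet queue, so bursts get absorbed rather than dropped. The td-agent setup above configures no equivalent knobs.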