A Log Collection Architecture Based on Flume + Log4j + Kafka (Part 5)

<dependencies>
    <dependency>
        <groupId>org.apache.logging.log4j</groupId>
        <artifactId>log4j-api</artifactId>
        <version>2.5</version>
    </dependency>
    <dependency>
        <groupId>org.apache.logging.log4j</groupId>
        <artifactId>log4j-core</artifactId>
        <version>2.5</version>
    </dependency>
    <dependency>
        <groupId>com.fasterxml.jackson.core</groupId>
        <artifactId>jackson-core</artifactId>
        <version>2.7.4</version>
    </dependency>
    <dependency>
        <groupId>com.fasterxml.jackson.core</groupId>
        <artifactId>jackson-databind</artifactId>
        <version>2.7.4</version>
    </dependency>
    <dependency>
        <groupId>com.fasterxml.jackson.core</groupId>
        <artifactId>jackson-annotations</artifactId>
        <version>2.7.4</version>
    </dependency>
    <dependency>
        <groupId>org.apache.kafka</groupId>
        <artifactId>kafka_2.11</artifactId>
        <version>0.9.0.1</version>
    </dependency>
</dependencies>

The log4j2.xml configuration file is as follows:

<?xml version="1.0" encoding="UTF-8"?>
<!-- Log4j2 configuration file -->
<Configuration status="DEBUG" strict="true" name="LOG4J2_DEMO" packages="com.banksteel.log.demo.log4j2">
    <Properties>
        <Property name="logPath">log</Property>
    </Properties>
    <Appenders>
        <!-- Console output format -->
        <Console name="Console" target="SYSTEM_OUT">
            <PatternLayout pattern="%highlight{%d{yyyy-MM-dd HH:mm:ss} %d{UNIX_MILLIS} [%t] %-5p %C{1.}:%L - %msg%n}"/>
        </Console>
        <!-- Push logs into Kafka; Storm will parse them into fields and store them in HBase. -->
        <Kafka name="Kafka" topic="server_log">
            <!-- Serialize log events as JSON. Note: complete="false" would emit each event
                 as a standalone JSON object, which is usually easier for downstream parsers. -->
            <JsonLayout complete="true" locationInfo="true"/>
            <!-- Kafka cluster addresses; the hostnames must resolve via the local hosts file, or through an Nginx front end -->
            <Property name="bootstrap.servers">Kafka-01:9092,Kafka-02:9092,Kafka-03:9092</Property>
        </Kafka>
    </Appenders>
    <Loggers>
        <Root level="DEBUG">
            <!-- Enable console logging -->
            <AppenderRef ref="Console"/>
            <!-- Enable log collection via Kafka -->
            <AppenderRef ref="Kafka"/>
        </Root>
    </Loggers>
</Configuration>
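The configuration above can be exercised with a minimal logger class. This is a sketch: the class name is hypothetical, the package is chosen to match the `packages` attribute in the configuration, and a Kafka cluster reachable at the configured `bootstrap.servers` addresses is assumed.

```java
package com.banksteel.log.demo.log4j2; // hypothetical package, matching the config's "packages" attribute

import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;

public class KafkaLogDemo { // hypothetical class name for illustration

    private static final Logger LOG = LogManager.getLogger(KafkaLogDemo.class);

    public static void main(String[] args) {
        // Each call below is written to the console and, through the Kafka
        // appender, published as a JSON document to the "server_log" topic.
        LOG.info("application started");
        LOG.warn("disk usage at {}%", 85);
        LOG.error("failed to process request", new IllegalStateException("demo exception"));

        // Shut down log4j2 explicitly so the Kafka producer flushes any
        // buffered records before the JVM exits.
        LogManager.shutdown();
    }
}
```

`LogManager.shutdown()` matters here: the Kafka appender batches records through an internal producer, and a short-lived process can exit before the batch is sent if logging is not stopped cleanly.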

With that in place everything is ready, and the complete JSON-formatted log events can be observed in Kafka.
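One way to check this from the command line is Kafka's bundled console tools. This is a sketch assuming the old (ZooKeeper-based) consumer that ships with Kafka 0.9; the ZooKeeper address, partition count, and replication factor below are assumptions not stated in the article.

```shell
# Create the topic the appender writes to (only needed if auto-creation is disabled).
bin/kafka-topics.sh --create --zookeeper Kafka-01:2181 \
    --topic server_log --partitions 3 --replication-factor 2

# Tail the topic; each record should be a JSON document produced by JsonLayout.
bin/kafka-console-consumer.sh --zookeeper Kafka-01:2181 \
    --topic server_log --from-beginning
```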
