Background

nginx-kafka-module is an nginx plug-in that integrates Kafka into nginx, making it easy to collect data from the front-end pages of a web project. With tracking code embedded in the front-end pages, user access and request data can be sent directly to the Kafka message middleware through HTTP requests, and back-end programs can then consume the messages from Kafka for real-time computation. For example, Spark Streaming can consume the data in Kafka in real time to analyze PV, UV, user behavior, and page funnel conversion rates, so as to better optimize the system or analyze visiting users in real time.

Specific integration steps

1. Install git

yum install -y git

2. Switch to the /usr/local/src directory and clone the Kafka C client (librdkafka) source code

cd /usr/local/src
git clone https://github.com/edenhill/librdkafka

3. Enter the librdkafka directory, then compile and install it

cd librdkafka
yum install -y gcc gcc-c++ pcre-devel zlib-devel
./configure
make && make install

4. Install the nginx Kafka plug-in: enter /usr/local/src and clone the ngx_kafka_module source code

cd /usr/local/src
git clone https://github.com/brg-liuwei/ngx_kafka_module

5. Enter the nginx source package directory and compile nginx together with the plug-in

cd /usr/local/src/nginx-1.12.2
./configure --add-module=/usr/local/src/ngx_kafka_module/
make && make install

6. Modify the nginx configuration file: set a location and a Kafka topic. Two places need to be added: the kafka and kafka_broker_list directives, and a location carrying a kafka_topic directive. For details, see nginx.conf in the current directory.

# add this configuration (2 places)
kafka;
kafka_broker_list f1:9092 f2:9092 f3:9092;

location = /kafka/access {
    kafka_topic access888;
}
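To make it clearer where those two pieces of configuration live, here is a minimal nginx.conf sketch. It is only an illustration, assuming the broker host names f1/f2/f3 and the topic access888 from the steps above, plus a plain server block listening on port 80; adjust these to your own environment:

http {
    # kafka and kafka_broker_list are declared once at the http level
    kafka;
    kafka_broker_list f1:9092 f2:9092 f3:9092;

    server {
        listen 80;
        server_name localhost;

        # request bodies POSTed to /kafka/access are produced to the access888 topic
        location = /kafka/access {
            kafka_topic access888;
        }
    }
}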
7. Start the ZooKeeper and Kafka clusters and create the topic (a sketch of the topic-creation command is given at the end of this article)

zkServer.sh start
kafka-server-start.sh -daemon config/server.properties

8. Start nginx. It reports an error: the librdkafka.so.1 shared library cannot be found.

9. Load the .so library

# load the libraries under /usr/local/lib at boot time
echo "/usr/local/lib" >> /etc/ld.so.conf
# load them manually right now
ldconfig

10. Test: write data to nginx, then check whether a Kafka consumer can consume it (a console-consumer sketch is also given at the end of this article)

curl http://localhost/kafka/access -d "message send to kafka topic"
curl http://localhost/kafka/access -d "Xiaowei666"

You can also simulate the request interface used by the page tracking code to send information; the messages sent then show up in the back-end Kafka consumer output.
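Step 7 mentions creating the topic. One way to do that, assuming ZooKeeper is listening on f1:2181 and a Kafka version whose kafka-topics.sh still accepts the --zookeeper flag (newer releases use --bootstrap-server instead), is roughly:

# create the access888 topic with 3 partitions and 3 replicas (adjust to your cluster size)
kafka-topics.sh --create --zookeeper f1:2181 --replication-factor 3 --partitions 3 --topic access888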
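To watch the messages arriving while running the curl tests from step 10, a console consumer can be attached to the topic. This is only a sketch, assuming the broker addresses used above:

# read the topic from the beginning so earlier test messages are also shown
kafka-console-consumer.sh --bootstrap-server f1:9092,f2:9092,f3:9092 --topic access888 --from-beginning

Each request body sent through nginx should appear as one message in the consumer output.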