jinan / wj-datacenter-platform / Commits

Commit 02a1ac3b, authored May 22, 2024 by hanbing
Parent: 08bc3541

    主控迁移Flink-接收 E1 数据 (Main-control migration to Flink: receive E1 data)
Changes: 3 changed files, 88 additions and 3 deletions

    wj-realtime-computing/pom.xml                                    +7   -0
    ...om/wanji/indicators/task/maincontrol/MainControlMain.java     +78  -0
    ...altime-computing/src/main/resources/config_dev.properties     +3   -3
wj-realtime-computing/pom.xml  (view file @ 02a1ac3b)

@@ -423,6 +423,13 @@
             </exclusions>
         </dependency>
+        <!-- SparkJava Dependency -->
+        <dependency>
+            <groupId>com.sparkjava</groupId>
+            <artifactId>spark-core</artifactId>
+            <version>2.9.3</version>
+        </dependency>
     </dependencies>
wj-realtime-computing/src/main/java/com/wanji/indicators/task/maincontrol/MainControlMain.java  (new file, mode 100644, view file @ 02a1ac3b)
package com.wanji.indicators.task.maincontrol;

import com.wanji.indicators.util.PropertiesHelper;
import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.connector.kafka.source.KafkaSource;
import org.apache.flink.connector.kafka.source.enumerator.initializer.OffsetsInitializer;
import org.apache.flink.connector.kafka.source.reader.deserializer.KafkaRecordDeserializationSchema;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.apache.kafka.common.serialization.StringSerializer;

import java.util.Properties;

import static spark.Spark.port;
import static spark.Spark.post;

public class MainControlMain {

    public static final Properties properties = PropertiesHelper.getInstance().getProperties();
    public static final String originE1Topic;
    public static final String kafkaServerAddress;
    private static final PropertiesHelper instance = PropertiesHelper.getInstance();

    static {
        originE1Topic = properties.getProperty("origin.e1.topic");
        kafkaServerAddress = properties.getProperty("bootstrap.servers");
    }

    public static void main(String[] args) throws Exception {
        // Start the embedded HTTP server
        startHttpServer();

        // Start the Flink job
        final StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Kafka source topic from the config file
        String sourceTopic = properties.getProperty("origin.e1.topic");

        KafkaSource<String> source = KafkaSource.<String>builder()
                .setProperties(instance.getConsumerProperties())
                .setProperty("auto.offset.commit", "true")
                .setProperty("auto.commit.interval.ms", "1000")
                .setProperty("commit.offsets.on.checkpoint", "true")
                .setBootstrapServers(kafkaServerAddress)
                .setTopics(sourceTopic)
                .setGroupId(properties.getProperty("consumer.group.id") + "-origin-e1")
                .setStartingOffsets(OffsetsInitializer.latest())
                .setDeserializer(KafkaRecordDeserializationSchema.valueOnly(StringDeserializer.class))
                .build();

        DataStream<String> stream = env.fromSource(source, WatermarkStrategy.noWatermarks(), "origin-e1-topic");
        stream.print();

        env.execute("启动主控 Job");
    }

    private static void startHttpServer() {
        port(19355);

        // Kafka producer configuration
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, kafkaServerAddress);
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        KafkaProducer<String, String> producer = new KafkaProducer<>(props);

        // Forward each POSTed request body to the origin E1 Kafka topic
        post("/submitE1Frame", (request, response) -> {
            String body = request.body();
            producer.send(new ProducerRecord<>(originE1Topic, body));
            response.status(200);
            return "主控收到 E1 数据";
        });
    }
}
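The `/submitE1Frame` route above accepts any HTTP POST body and relays it to Kafka. A minimal client sketch, assuming the server runs on localhost (the port 19355 and path come from `MainControlMain`; the host and the JSON payload are illustrative assumptions):

```java
import java.net.URI;
import java.net.http.HttpRequest;

public class SubmitE1FrameExample {
    public static void main(String[] args) {
        // Port and path come from MainControlMain; localhost is an assumption.
        URI endpoint = URI.create("http://localhost:19355/submitE1Frame");

        // Placeholder E1 payload; the real frame format is defined elsewhere in the project.
        String e1Frame = "{\"frameId\":\"demo\"}";

        HttpRequest request = HttpRequest.newBuilder(endpoint)
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(e1Frame))
                .build();

        // Actually sending requires the server to be running, e.g.:
        // HttpClient.newHttpClient().send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(request.method() + " " + request.uri());
    }
}
```

Anything posted here lands on the `origin.e1.topic` topic and is then consumed by the Flink job in the same process.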
wj-realtime-computing/src/main/resources/config_dev.properties  (view file @ 02a1ac3b)

@@ -175,9 +175,9 @@ device.cross=13NI00B5RM0:13NI00B5RM0,13NGH0B5RC0:13NGH0B5RC0,13NF80B5QN0:13NF80B
 # \u57CE\u5E02\u5927\u8111 laneId \u4E0E\u4E07\u96C6 laneId \u6620\u5C04 (City Brain laneId to Wanji laneId mapping)
 brain.wanji.lane.id=test:test
 jdbc.driver=com.mysql.cj.jdbc.Driver
 jdbc.url=jdbc:mysql://localhost:3306/beihang?userUnicode=true&characterEncoding=utf-8
 jdbc.username=root
-jdbc.password=123456
\ No newline at end of file
+jdbc.password=123456
+origin.e1.topic=origin-e1
\ No newline at end of file
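The new `origin.e1.topic` key is what `MainControlMain` reads in its static initializer via `PropertiesHelper` (project code not shown in this commit); the lookup itself reduces to standard `java.util.Properties`. A minimal sketch, assuming the config fragment above (the `bootstrap.servers` value is an illustrative assumption, not part of this diff):

```java
import java.io.IOException;
import java.io.StringReader;
import java.util.Properties;

public class ConfigLookupExample {
    public static void main(String[] args) throws IOException {
        // Inline stand-in for config_dev.properties; origin.e1.topic mirrors the diff above.
        String config =
                "bootstrap.servers=localhost:9092\n" +  // assumed value, not in the diff
                "origin.e1.topic=origin-e1\n";

        Properties properties = new Properties();
        properties.load(new StringReader(config));

        // The same lookup MainControlMain performs for its Kafka source topic.
        System.out.println(properties.getProperty("origin.e1.topic"));  // prints "origin-e1"
    }
}
```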