Commit 770502d2 authored by xuli

Initial commit

### Guotai Junan (GTJA) Log & Metric Applets Collection
![https://img.shields.io/badge/license-Apache%202.0-blue.svg?longCache=true&style=flat-square](https://img.shields.io/badge/license-Apache%202.0-blue.svg?longCache=true&style=flat-square)
![https://img.shields.io/badge/springcloud-Hoxton.RELEASE-yellow.svg?style=flat-square](https://img.shields.io/badge/springcloud-Hoxton.RELEASE-yellow.svg?style=flat-square)
![https://img.shields.io/badge/SpringCloudAlibaba-2.1.1.RELEASE-blueviolet.svg?style=flat-square](https://img.shields.io/badge/SpringCloudAlibaba-2.1.1.RELEASE-blueviolet.svg?style=flat-square)
![https://img.shields.io/badge/springboot-2.2.1.RELEASE-brightgreen.svg?style=flat-square](https://img.shields.io/badge/springboot-2.2.1.RELEASE-brightgreen.svg?style=flat-square)
### Project Repository
Platform | Zork-GTJA-Applets (backend)
---|---
GitHub | https://git.zorkdata.com/longliliang/zork-applets.git
### Service Modules
Cloud modules:
Service | Port | Description
---|---|---
Junhiro-Trading-Count | 8201 | Aggregates Junhong trading data between 09:00 and 10:00 on trading days and outputs the percentage difference versus the previous day
Log-Commission | 8202 | Commission order-count queries
Metric-Omnichannel-Oracle | 8203 | Metric sets for the omnichannel customer-service and online account-opening systems, queried via Oracle
Metric-Zhtt | 8204 | Queries metric data from a MySQL database
Zork-Cloud-Admin | 8401 | Microservice monitoring subsystem
### Directory Structure
```
├─zork-applets ------ parent module of the whole project
├─zork-common ------ shared common module
├─zork-jar ------ standalone executable JAR module
│ ├─zork-jar-htpocesb ------ imports Haitong CSV machine-mapping data into InfluxDB
├─zork-server ------ Spring Boot service module
│ ├─junhiro-trading-count ------ aggregates Junhong trading data between 09:00 and 10:00 on trading days and outputs the percentage difference versus the previous day
│ ├─log-commission ------ commission order-count queries
│ ├─metric-omnichannel-oracle ------ metric sets for the omnichannel customer-service and online account-opening systems, queried via Oracle
│ ├─metric-zhtt ------ queries metric data from a MySQL database
└─zork-cloud-admin ------ microservice monitoring subsystem
```
2021-09-26 13:25:07 INFO [Zork-Monitor-Admin,,,] main com.zork.ZorkMonitorAdminApplication Starting ZorkMonitorAdminApplication on DESKTOP-7KU01DL with PID 2332 (E:\zork\code\applets\zork-monitor\zork-monitor-admin\target\classes started by Prock.Liy in E:\zork\code\applets)
2021-09-26 13:25:07 INFO [Zork-Monitor-Admin,,,] main com.zork.ZorkMonitorAdminApplication No active profile set, falling back to default profiles: default
2021-09-26 13:25:09 INFO [Zork-Monitor-Admin,,,] main org.eclipse.jetty.util.log Logging initialized @5536ms to org.eclipse.jetty.util.log.Slf4jLog
2021-09-26 13:25:09 INFO [Zork-Monitor-Admin,,,] main org.springframework.boot.web.embedded.jetty.JettyServletWebServerFactory Server initialized with port: 8401
2021-09-26 13:25:09 INFO [Zork-Monitor-Admin,,,] main org.eclipse.jetty.server.Server jetty-9.4.20.v20190813; built: 2019-08-13T21:28:18.144Z; git: 84700530e645e812b336747464d6fbbf370c9a20; jvm 1.8.0_221-b11
2021-09-26 13:25:09 INFO [Zork-Monitor-Admin,,,] main org.eclipse.jetty.server.handler.ContextHandler.application Initializing Spring embedded WebApplicationContext
2021-09-26 13:25:09 INFO [Zork-Monitor-Admin,,,] main org.springframework.web.context.ContextLoader Root WebApplicationContext: initialization completed in 1490 ms
2021-09-26 13:25:09 INFO [Zork-Monitor-Admin,,,] main org.eclipse.jetty.server.session DefaultSessionIdManager workerName=node0
2021-09-26 13:25:09 INFO [Zork-Monitor-Admin,,,] main org.eclipse.jetty.server.session No SessionScavenger set, using defaults
2021-09-26 13:25:09 INFO [Zork-Monitor-Admin,,,] main org.eclipse.jetty.server.session node0 Scavenging every 600000ms
2021-09-26 13:25:09 INFO [Zork-Monitor-Admin,,,] main org.eclipse.jetty.server.handler.ContextHandler Started o.s.b.w.e.j.JettyEmbeddedWebAppContext@7a1f8def{application,/,[file:///C:/Users/Prock.Liy/AppData/Local/Temp/jetty-docbase.2938222063840967226.8401/],AVAILABLE}
2021-09-26 13:25:09 INFO [Zork-Monitor-Admin,,,] main org.eclipse.jetty.server.Server Started @5884ms
2021-09-26 13:25:09 INFO [Zork-Monitor-Admin,,,] main org.springframework.scheduling.concurrent.ThreadPoolTaskExecutor Initializing ExecutorService 'applicationTaskExecutor'
2021-09-26 13:25:11 INFO [Zork-Monitor-Admin,,,] main org.springframework.boot.actuate.endpoint.web.EndpointLinksResolver Exposing 2 endpoint(s) beneath base path '/actuator'
2021-09-26 13:25:11 INFO [Zork-Monitor-Admin,,,] main org.eclipse.jetty.server.handler.ContextHandler.application Initializing Spring DispatcherServlet 'dispatcherServlet'
2021-09-26 13:25:11 INFO [Zork-Monitor-Admin,,,] main org.springframework.web.servlet.DispatcherServlet Initializing Servlet 'dispatcherServlet'
2021-09-26 13:25:11 INFO [Zork-Monitor-Admin,,,] main org.springframework.web.servlet.DispatcherServlet Completed initialization in 5 ms
2021-09-26 13:25:11 INFO [Zork-Monitor-Admin,,,] main org.eclipse.jetty.server.AbstractConnector Started ServerConnector@4fcc0416{HTTP/1.1,[http/1.1]}{0.0.0.0:8401}
2021-09-26 13:25:11 INFO [Zork-Monitor-Admin,,,] main org.springframework.boot.web.embedded.jetty.JettyWebServer Jetty started on port(s) 8401 (http/1.1) with context path '/'
2021-09-26 13:25:11 INFO [Zork-Monitor-Admin,,,] main com.zork.ZorkMonitorAdminApplication Started ZorkMonitorAdminApplication in 4.914 seconds (JVM running for 7.957)
2021-09-26 13:25:12 INFO [Zork-Monitor-Admin,,,] RMI TCP Connection(3)-10.199.162.1 org.eclipse.jetty.util.TypeUtil JVM Runtime does not support Modules
2021-09-26 13:29:08 INFO [Zork-Monitor-Admin,,,] parallel-11 de.codecentric.boot.admin.server.services.StatusUpdater Couldn't retrieve status for Instance(id=147bdeed8a9d, version=0, registration=Registration(name=metric-zhtt, managementUrl=http://DESKTOP-7KU01DL:8193/actuator, healthUrl=http://DESKTOP-7KU01DL:8193/actuator/health, serviceUrl=http://DESKTOP-7KU01DL:8193/, source=http-api), registered=true, statusInfo=StatusInfo(status=UNKNOWN, details={}), statusTimestamp=2021-09-26T05:28:58.482Z, info=Info(values={}), endpoints=Endpoints(endpoints={health=Endpoint(id=health, url=http://DESKTOP-7KU01DL:8193/actuator/health)}), buildVersion=null, tags=Tags(values={}))
java.util.concurrent.TimeoutException: Did not observe any item or terminal signal within 10000ms in 'map' (and no fallback has been configured)
	at reactor.core.publisher.FluxTimeout$TimeoutMainSubscriber.handleTimeout(FluxTimeout.java:288)
	Suppressed: reactor.core.publisher.FluxOnAssembly$OnAssemblyException:
Error has been observed at the following site(s):
	|_ checkpoint ⇢ Request to GET health [DefaultWebClient]
Stack trace:
	at reactor.core.publisher.FluxTimeout$TimeoutMainSubscriber.handleTimeout(FluxTimeout.java:288)
	at reactor.core.publisher.FluxTimeout$TimeoutMainSubscriber.doTimeout(FluxTimeout.java:273)
	at reactor.core.publisher.FluxTimeout$TimeoutTimeoutSubscriber.onNext(FluxTimeout.java:390)
	at reactor.core.publisher.StrictSubscriber.onNext(StrictSubscriber.java:89)
	at reactor.core.publisher.FluxOnErrorResume$ResumeSubscriber.onNext(FluxOnErrorResume.java:73)
	at reactor.core.publisher.MonoDelay$MonoDelayRunnable.run(MonoDelay.java:117)
	at reactor.core.scheduler.SchedulerTask.call(SchedulerTask.java:68)
	at reactor.core.scheduler.SchedulerTask.call(SchedulerTask.java:28)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
2021-09-26 13:30:43 INFO [Zork-Monitor-Admin,,,] reactor-http-nio-3 de.codecentric.boot.admin.server.services.StatusUpdater Couldn't retrieve status for Instance(id=147bdeed8a9d, version=3, registration=Registration(name=metric-zhtt, managementUrl=http://DESKTOP-7KU01DL:8193/actuator, healthUrl=http://DESKTOP-7KU01DL:8193/actuator/health, serviceUrl=http://DESKTOP-7KU01DL:8193/, source=http-api), registered=true, statusInfo=StatusInfo(status=UP, details={}), statusTimestamp=2021-09-26T05:29:21.911Z, info=Info(values={}), endpoints=Endpoints(endpoints={health=Endpoint(id=health, url=http://DESKTOP-7KU01DL:8193/actuator/health), info=Endpoint(id=info, url=http://DESKTOP-7KU01DL:8193/actuator/info)}), buildVersion=null, tags=Tags(values={}))
io.netty.channel.AbstractChannel$AnnotatedConnectException: Connection refused: no further information: DESKTOP-7KU01DL/10.199.162.1:8193
	Suppressed: reactor.core.publisher.FluxOnAssembly$OnAssemblyException:
Error has been observed at the following site(s):
	|_ checkpoint ⇢ Request to GET health [DefaultWebClient]
Stack trace:
Caused by: java.net.ConnectException: Connection refused: no further information
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
	at io.netty.channel.socket.nio.NioSocketChannel.doFinishConnect(NioSocketChannel.java:327)
	at io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.finishConnect(AbstractNioChannel.java:334)
	at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:688)
	at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:635)
	at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:552)
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:514)
	at io.netty.util.concurrent.SingleThreadEventExecutor$6.run(SingleThreadEventExecutor.java:1044)
	at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
	at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
	at java.lang.Thread.run(Thread.java:748)
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns="http://maven.apache.org/POM/4.0.0"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.zork</groupId>
<artifactId>zork-applets</artifactId>
<version>2.2-RELEASE</version>
<packaging>pom</packaging>
<name>Zork-Applets</name>
<description>Applets: mini-programs, standalone JARs, and services</description>
<modules>
<module>../zork-server</module>
<module>../zork-jar</module>
<module>../zork-common</module>
<module>../zork-monitor</module>
</modules>
<parent>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-parent</artifactId>
<!-- Upgraded -->
<version>2.2.0.RELEASE</version>
<relativePath/> <!-- lookup parent from repository -->
</parent>
<properties>
<java.version>1.8</java.version>
<!-- Upgraded -->
<spring-cloud.version>Hoxton.RELEASE</spring-cloud.version>
<!-- Upgraded -->
<com-alibaba-cloud.version>2.1.1.RELEASE</com-alibaba-cloud.version>
<codingapi.txlcn.version>5.0.2.RELEASE</codingapi.txlcn.version>
<fastjson.version>2.0.5</fastjson.version>
<xml-api.version>1.4.01</xml-api.version>
<ip2region.version>1.7</ip2region.version>
<guava.version>27.0-jre</guava.version>
<excelkit.version>2.0.71</excelkit.version>
<mybatis-plus.version>3.2.0</mybatis-plus.version>
<dynamic-datasource.version>2.5.7</dynamic-datasource.version>
<p6spy.version>3.8.5</p6spy.version>
<spring-boot-admin.version>2.2.0</spring-boot-admin.version>
<easy-captcha.version>1.6.2</easy-captcha.version>
<logstash-logback-encoder.version>6.1</logstash-logback-encoder.version>
<justauth.version>1.1.0</justauth.version>
<jjwt.version>0.9.1</jjwt.version>
<knife4j.version>2.0.2</knife4j.version>
<springfox.version>2.9.2</springfox.version>
<swagger.version>1.5.21</swagger.version>
</properties>
<dependencyManagement>
<dependencies>
<dependency>
<groupId>org.springframework.cloud</groupId>
<artifactId>spring-cloud-dependencies</artifactId>
<version>${spring-cloud.version}</version>
<type>pom</type>
<scope>import</scope>
</dependency>
<!-- Upgraded: replaced with com.alibaba.cloud -->
<dependency>
<groupId>com.alibaba.cloud</groupId>
<artifactId>spring-cloud-alibaba-dependencies</artifactId>
<version>${com-alibaba-cloud.version}</version>
<type>pom</type>
<scope>import</scope>
</dependency>
</dependencies>
</dependencyManagement>
</project>
<?xml version="1.0" encoding="UTF-8"?>
<module org.jetbrains.idea.maven.project.MavenProjectsManager.isMavenModule="true" type="JAVA_MODULE" version="4">
<component name="NewModuleRootManager" LANGUAGE_LEVEL="JDK_1_8">
<output url="file://$MODULE_DIR$/target/classes" />
<output-test url="file://$MODULE_DIR$/target/test-classes" />
<content url="file://$MODULE_DIR$">
<excludeFolder url="file://$MODULE_DIR$/target" />
</content>
<orderEntry type="inheritedJdk" />
<orderEntry type="sourceFolder" forTests="false" />
</component>
</module>
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<artifactId>zork-applets</artifactId>
<groupId>com.zork</groupId>
<version>2.2-RELEASE</version>
<relativePath>../zork-applets/pom.xml</relativePath>
</parent>
<artifactId>zork-common</artifactId>
<name>Zork-Common</name>
<description>Common shared module</description>
<dependencies>
<dependency>
<groupId>com.belerweb</groupId>
<artifactId>pinyin4j</artifactId>
<version>2.5.1</version>
</dependency>
<!-- Used for parsing YAML files -->
<dependency>
<groupId>org.yaml</groupId>
<artifactId>snakeyaml</artifactId>
<version>1.25</version>
</dependency>
<dependency>
<groupId>org.influxdb</groupId>
<artifactId>influxdb-java</artifactId>
<version>2.10</version>
</dependency>
<dependency>
<groupId>com.opencsv</groupId>
<artifactId>opencsv</artifactId>
<version>5.0</version>
</dependency>
<dependency>
<groupId>org.projectlombok</groupId>
<artifactId>lombok</artifactId>
</dependency>
<!-- Database connection pool -->
<dependency>
<groupId>com.alibaba</groupId>
<artifactId>druid</artifactId>
<version>1.1.20</version>
</dependency>
<dependency>
<groupId>com.alibaba</groupId>
<artifactId>fastjson</artifactId>
<version>1.2.83</version>
</dependency>
<dependency>
<groupId>org.apache.commons</groupId>
<artifactId>commons-lang3</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-web</artifactId>
</dependency>
<!-- Token storage -->
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-data-redis</artifactId>
</dependency>
<!-- Upgraded -->
<dependency>
<groupId>de.codecentric</groupId>
<artifactId>spring-boot-admin-starter-client</artifactId>
<version>2.2.0</version>
</dependency>
<!-- Custom configuration classes also require the spring-boot-configuration-processor dependency -->
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-configuration-processor</artifactId>
<optional>true</optional>
</dependency>
<!-- Hystrix dependency -->
<dependency>
<groupId>org.springframework.cloud</groupId>
<artifactId>spring-cloud-starter-netflix-hystrix</artifactId>
</dependency>
<!-- MySQL access via MyBatis-Plus -->
<dependency>
<groupId>com.baomidou</groupId>
<artifactId>mybatis-plus-boot-starter</artifactId>
<version>3.2.0</version>
</dependency>
<!-- Prometheus -->
<dependency>
<groupId>io.micrometer</groupId>
<artifactId>micrometer-registry-prometheus</artifactId>
</dependency>
<!-- ojdbc6.jar corresponds to Oracle 11g -->
<!-- https://mvnrepository.com/artifact/oracle/ojdbc6 -->
<!-- <dependency>-->
<!-- <groupId>com.oracle</groupId>-->
<!-- <artifactId>ojdbc6</artifactId>-->
<!-- <version>11.2.0.3</version>-->
<!-- </dependency>-->
<!-- Database connection pool (Spring Boot starter) -->
<dependency>
<groupId>com.alibaba</groupId>
<artifactId>druid-spring-boot-starter</artifactId>
<version>1.1.10</version>
</dependency>
<dependency>
<groupId>org.springframework.kafka</groupId>
<artifactId>spring-kafka</artifactId>
<version>2.3.1.RELEASE</version>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.dataformat</groupId>
<artifactId>jackson-dataformat-yaml</artifactId>
<version>2.9.8</version>
</dependency>
<dependency>
<groupId>org.apache.kafka</groupId>
<artifactId>kafka-clients</artifactId>
<version>0.8.2.1</version>
</dependency>
<dependency>
<groupId>com.google.protobuf</groupId>
<artifactId>protobuf-java</artifactId>
<version>3.1.0</version>
</dependency>
</dependencies>
</project>
package com.zork.common;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
@SpringBootApplication
public class CommonApplication {
public static void main(String[] args) {
SpringApplication.run(CommonApplication.class, args);
}
}
package com.zork.common.configure;
import com.fasterxml.jackson.annotation.JsonAutoDetect;
import com.fasterxml.jackson.annotation.PropertyAccessor;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.zork.common.service.RedisService;
import org.springframework.boot.autoconfigure.condition.ConditionalOnBean;
import org.springframework.boot.autoconfigure.condition.ConditionalOnClass;
import org.springframework.context.annotation.Bean;
import org.springframework.data.redis.connection.RedisConnectionFactory;
import org.springframework.data.redis.core.RedisOperations;
import org.springframework.data.redis.core.RedisTemplate;
import org.springframework.data.redis.serializer.Jackson2JsonRedisSerializer;
import org.springframework.data.redis.serializer.StringRedisSerializer;
/**
* @Author Prock.Liy
* @Date 2020/10/11 21:07
* @Description
**/
public class ZorkLettuceRedisConfigure {
/**
 * Use StringRedisSerializer for key serialization and Jackson2JsonRedisSerializer for value serialization.
 * @param factory the Redis connection factory
 * @return the configured RedisTemplate
 */
@Bean
@ConditionalOnClass(RedisOperations.class)
public RedisTemplate<String, Object> redisTemplate(RedisConnectionFactory factory) {
RedisTemplate<String, Object> template = new RedisTemplate<>();
template.setConnectionFactory(factory);
Jackson2JsonRedisSerializer<Object> jackson2JsonRedisSerializer = new Jackson2JsonRedisSerializer<>(Object.class);
ObjectMapper mapper = new ObjectMapper();
mapper.setVisibility(PropertyAccessor.ALL, JsonAutoDetect.Visibility.ANY);
mapper.enableDefaultTyping(ObjectMapper.DefaultTyping.NON_FINAL);
jackson2JsonRedisSerializer.setObjectMapper(mapper);
StringRedisSerializer stringRedisSerializer = new StringRedisSerializer();
// Serialize keys as plain strings
template.setKeySerializer(stringRedisSerializer);
// Serialize hash keys as plain strings
template.setHashKeySerializer(stringRedisSerializer);
// Serialize values with Jackson
template.setValueSerializer(jackson2JsonRedisSerializer);
// Serialize hash values with Jackson
template.setHashValueSerializer(jackson2JsonRedisSerializer);
template.afterPropertiesSet();
return template;
}
/**
 * Register the RedisService defined above in the IoC container, provided a bean
 * named "redisTemplate" already exists in the container.
 * @return a RedisService instance
 */
@Bean
@ConditionalOnBean(name = "redisTemplate")
public RedisService redisService() {
return new RedisService();
}
}
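ZorkLettuceRedisConfigure carries no @Configuration annotation, so a consuming service has to register it explicitly. A minimal sketch under that assumption (the @Import approach and the demo class name are hypothetical; the repository may wire it differently, e.g. via spring.factories):
```java
package com.zork.demo; // hypothetical demo package

import com.zork.common.configure.ZorkLettuceRedisConfigure;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.annotation.Import;

// Importing the configuration makes redisTemplate and redisService injectable beans.
@SpringBootApplication
@Import(ZorkLettuceRedisConfigure.class)
public class RedisDemoApplication {
    public static void main(String[] args) {
        SpringApplication.run(RedisDemoApplication.class, args);
    }
}
```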
package com.zork.common.constant;
/**
* @Author Prock.Liy
* @Date 2020/10/10 15:43
 * @Description Date pattern constants
**/
public class DateConstant {
public static final String FULL_TIME_PATTERN = "yyyyMMddHHmmss";
public static final String FULL_TIME_SPLIT_PATTERN = "yyyy-MM-dd HH:mm:ss";
public static final String CST_TIME_PATTERN = "EEE MMM dd HH:mm:ss zzz yyyy";
public static final String SS_XXX_TIME_08 = "yyyy-MM-dd'T'HH:mm:ssXXX";
public static final String STANDARD_DATE_FORMAT_UTC = "yyyy-MM-dd'T'HH:mm:ss.SSS'Z'";
}
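A minimal sketch of how these patterns are typically consumed (the java.time usage below is illustrative, not taken from this repository):
```java
import com.zork.common.constant.DateConstant;
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;

public class DateConstantDemo {
    public static void main(String[] args) {
        // Format the current time with the repository's split pattern, e.g. "2021-09-26 13:25:07".
        DateTimeFormatter formatter = DateTimeFormatter.ofPattern(DateConstant.FULL_TIME_SPLIT_PATTERN);
        System.out.println(LocalDateTime.now().format(formatter));
    }
}
```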
package com.zork.common.constant;
/**
* @Author: Prock.Liy
* @Date: 2021/9/6
 * @Description: Common dimension magic-value constants
*/
public class DimensionConstant {
public static final String APP_SYSTEM = "appsystem";
public static final String LOG_TYPE_NAME = "logTypeName";
public static final String INDEX = "index";
}
package com.zork.common.constant;
/**
* @Author Prock.Liy
* @Date 2020/10/10 15:43
 * @Description InfluxDB constants
**/
public class InfluxDBConstant {
public static final String AUTOGEN = "autogen";
}
package com.zork.common.dto;
import lombok.AllArgsConstructor;
import lombok.Builder;
import lombok.Data;
import lombok.NoArgsConstructor;
@Data
@Builder
@NoArgsConstructor
@AllArgsConstructor
public class Dimension{
private String appsystem;
private String hostname;
private String appprogramname;
private String ip;
}
package com.zork.common.dto;
/**
* @Author: Prock.Liy
* @Date: 2021/6/4
* @Description:
*/
public class ExecuteResultDTO {
private String taskStepId;
private boolean isSucceed = false;
private Integer status;
private String logs;
public String getTaskStepId() {
return taskStepId;
}
public void setTaskStepId(String taskStepId) {
this.taskStepId = taskStepId;
}
public boolean isSucceed() {
return isSucceed;
}
public void setSucceed(boolean succeed) {
isSucceed = succeed;
}
public Integer getStatus() {
return status;
}
public void setStatus(Integer status) {
this.status = status;
}
public String getLog() {
    // Guard against a missing log before stripping carriage returns
    return logs == null ? null : logs.replaceAll("\r", "");
}
public void setLog(String log) {
this.logs = log;
}
@Override
public String toString() {
return "ExecuteResult{" +
"taskStepId='" + taskStepId + '\'' +
", isSucceed=" + isSucceed +
", status='" + status + '\'' +
", logs='" + logs + '\'' +
'}';
}
}
package com.zork.common.dto;
import lombok.Data;
import java.util.List;
/**
* @Author: Prock.Liy
* @Date: 2021/7/21
* @Description:
*/
@Data
public class KeyValueDTO {
private List<String> keyList;
private List<String> valueList;
}
package com.zork.common.dto;
import lombok.AllArgsConstructor;
import lombok.Builder;
import lombok.Data;
import lombok.NoArgsConstructor;
import java.util.Map;
/**
* @Author: Prock.Liy
* @Date: 2021/9/6
 * @Description: Metric data structure
*/
@Builder
@Data
@NoArgsConstructor
@AllArgsConstructor
public class MetricDTO {
private String metricsetname;
private Map<String,String> dimensions;
private Map<String,String> measures;
private Long timestamp;
private String time;
}
package com.zork.common.dto;
import lombok.AllArgsConstructor;
import lombok.Builder;
import lombok.Data;
import lombok.NoArgsConstructor;
import java.util.Map;
/**
* @Author: Prock.Liy
* @Date: 2021/9/6
 * @Description: Log set data format
*/
@Data
@Builder
@NoArgsConstructor
@AllArgsConstructor
public class NormalFieldsDTO {
private String logTypeName;
private Dimension dimensions;
private Map<String,String> normalFields;
private Map<String,Double> measures;
private String source;
private String offset;
private Long timestamp;
}
package com.zork.common.entity;
import lombok.Data;
import java.io.Serializable;
/**
* @Author Prock.Liy
* @Date 2020/10/11 22:28
* @Description
**/
@Data
public class QueryRequest implements Serializable {
private static final long serialVersionUID = -4869594085374385813L;
/**
 * Page size (rows per page)
 */
private int pageSize = 10;
/**
 * Current page number
 */
private int pageNum = 1;
/**
 * Sort field
 */
private String field;
/**
 * Sort order: asc ascending, desc descending
 */
private String order;
}
package com.zork.common.entity;
import lombok.Data;
import java.util.HashMap;
/**
* @Author Prock.Liy
* @Date 2020/9/14 22:50
* @Description
**/
@Data
public class ZorkResponse extends HashMap<String, Object> {
private static final long serialVersionUID = -8713837118340960775L;
public ZorkResponse message(String message) {
this.put("message", message);
return this;
}
public ZorkResponse data(Object data) {
this.put("data", data);
return this;
}
@Override
public ZorkResponse put(String key, Object value) {
super.put(key, value);
return this;
}
public String getMessage() {
return String.valueOf(get("message"));
}
public Object getData() {
return get("data");
}
}
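Since put() returns this, responses can be built fluently. A minimal usage sketch (demo class only):
```java
import com.zork.common.entity.ZorkResponse;
import java.util.Arrays;

public class ZorkResponseDemo {
    public static void main(String[] args) {
        // message()/data() both delegate to the overridden put(), so calls chain.
        ZorkResponse response = new ZorkResponse()
                .message("ok")
                .data(Arrays.asList("a", "b"));
        System.out.println(response.getMessage()); // ok
        System.out.println(response.getData());    // [a, b]
    }
}
```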
package com.zork.common.exception;
/**
* @Author Prock.Liy
* @Date 2020/10/11 22:37
* @Description
**/
public class ZorkException extends Exception{
private static final long serialVersionUID = -6916154462432027437L;
public ZorkException(String message){
super(message);
}
}
package com.zork.common.handler;
import com.zork.common.entity.ZorkResponse;
import com.zork.common.exception.ZorkException;
import lombok.extern.slf4j.Slf4j;
import org.apache.commons.lang3.StringUtils;
import org.springframework.http.HttpStatus;
import org.springframework.validation.BindException;
import org.springframework.validation.FieldError;
import org.springframework.web.bind.annotation.ExceptionHandler;
import org.springframework.web.bind.annotation.ResponseStatus;
import javax.security.auth.message.AuthException;
import javax.validation.ConstraintViolation;
import javax.validation.ConstraintViolationException;
import javax.validation.Path;
import java.nio.file.AccessDeniedException;
import java.util.List;
import java.util.Set;
/**
* @Author Prock.Liy
* @Date 2020/10/10 14:15
 * @Description Global exception handler
**/
@Slf4j
public class BaseExceptionHandler {
@ExceptionHandler(value = Exception.class)
@ResponseStatus(HttpStatus.INTERNAL_SERVER_ERROR)
public ZorkResponse handleException(Exception e) {
log.error("系统内部异常,异常信息", e);
return new ZorkResponse().message("系统内部异常");
}
@ExceptionHandler(value = AuthException.class)
@ResponseStatus(HttpStatus.INTERNAL_SERVER_ERROR)
public ZorkResponse handleCloudAuthException(AuthException e) {
log.error("系统错误", e);
return new ZorkResponse().message(e.getMessage());
}
@ExceptionHandler(value = AccessDeniedException.class)
@ResponseStatus(HttpStatus.FORBIDDEN)
public ZorkResponse handleAccessDeniedException(){
return new ZorkResponse().message("没有权限访问该资源");
}
@ExceptionHandler(value = ZorkException.class)
@ResponseStatus(HttpStatus.INTERNAL_SERVER_ERROR)
public ZorkResponse handleCloudException(ZorkException e) {
log.error("系统错误", e);
return new ZorkResponse().message(e.getMessage());
}
/**
 * Unified handling of request parameter validation (plain parameters)
 *
 * @param e ConstraintViolationException
 * @return ZorkResponse
 */
@ExceptionHandler(value = ConstraintViolationException.class)
@ResponseStatus(HttpStatus.BAD_REQUEST)
public ZorkResponse handleConstraintViolationException(ConstraintViolationException e) {
StringBuilder message = new StringBuilder();
Set<ConstraintViolation<?>> violations = e.getConstraintViolations();
for (ConstraintViolation<?> violation : violations) {
Path path = violation.getPropertyPath();
String[] pathArr = StringUtils.splitByWholeSeparatorPreserveAllTokens(path.toString(), ".");
message.append(pathArr[1]).append(violation.getMessage()).append(",");
}
message = new StringBuilder(message.substring(0, message.length() - 1));
return new ZorkResponse().message(message.toString());
}
/**
 * Unified handling of request parameter validation (entity object parameters)
 *
 * @param e BindException
 * @return ZorkResponse
 */
@ExceptionHandler(BindException.class)
@ResponseStatus(HttpStatus.BAD_REQUEST)
public ZorkResponse handleBindException(BindException e) {
StringBuilder message = new StringBuilder();
List<FieldError> fieldErrors = e.getBindingResult().getFieldErrors();
for (FieldError error : fieldErrors) {
message.append(error.getField()).append(error.getDefaultMessage()).append(",");
}
message = new StringBuilder(message.substring(0, message.length() - 1));
return new ZorkResponse().message(message.toString());
}
}
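BaseExceptionHandler itself is not annotated, so each service presumably exposes the shared handlers through a @RestControllerAdvice subclass. A minimal sketch under that assumption (the subclass name is hypothetical):
```java
import com.zork.common.handler.BaseExceptionHandler;
import org.springframework.web.bind.annotation.RestControllerAdvice;

// Registers the inherited @ExceptionHandler methods as global controller advice.
@RestControllerAdvice
public class GlobalExceptionHandler extends BaseExceptionHandler {
}
```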
package com.zork.common.handler;
import lombok.extern.slf4j.Slf4j;
import java.util.function.Consumer;
/**
* @Author: Prock.Liy
* @Date: 2021/6/2
* @Description:
*/
@Slf4j
public class ConsumerWrapperHandler {
/**
 * Wrap a consumer that throws checked exceptions so it can be used in a stream;
 * exceptions are caught and logged instead of propagating.
 * @param throwingConsumer the consumer that may throw
 * @param <T> element type
 * @return a plain Consumer safe for stream pipelines
 */
public static <T> Consumer<T> throwingConsumerWrapper(
com.zork.common.handler.ThrowingConsumer<T, Exception> throwingConsumer) {
return i -> {
try {
throwingConsumer.accept(i);
} catch (Exception ex) {
ex.printStackTrace();
log.error("\n Stream Exception: -> {}", ex.getMessage());
}
};
}
}
package com.zork.common.handler;
/**
* @Author: Prock.Liy
* @Date: 2021/6/2
* @Description:
*/
@FunctionalInterface
public interface ThrowingConsumer<T, E extends Exception>{
void accept(T t) throws E;
}
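A minimal usage sketch combining ConsumerWrapperHandler and ThrowingConsumer (the file-reading lambda and file names are illustrative):
```java
import com.zork.common.handler.ConsumerWrapperHandler;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.stream.Stream;

public class ConsumerWrapperDemo {
    public static void main(String[] args) {
        // Files.readAllBytes throws IOException, which a plain Consumer cannot declare;
        // the wrapper catches and logs it so the stream keeps processing other elements.
        Stream.of("a.txt", "b.txt")
              .forEach(ConsumerWrapperHandler.throwingConsumerWrapper(
                      name -> System.out.println(Files.readAllBytes(Paths.get(name)).length)));
    }
}
```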
package com.zork.common.service;
import lombok.Data;
import lombok.extern.slf4j.Slf4j;
import org.influxdb.InfluxDB;
import org.influxdb.InfluxDBFactory;
import org.influxdb.dto.BatchPoints;
import org.influxdb.dto.Point;
import org.influxdb.dto.Query;
import org.influxdb.dto.QueryResult;
import java.util.List;
import java.util.Map;
import java.util.concurrent.TimeUnit;
/**
* @Author: Prock.Liy
* @Date: 2021/6/16
 * @Description: InfluxDB common service class
*/
@Data
@Slf4j
public class InfluxDBService {
private String username; // user name
private String password; // password
private String openurl;  // connection URL
private String database; // database name
private InfluxDB influxDB;
public InfluxDBService(String username, String password, String openurl, String database) {
this.username = username;
this.password = password;
this.openurl = openurl;
this.database = database;
}
/**
 * Connect to the time-series database and obtain the InfluxDB client.
 **/
public InfluxDB influxDbBuild() {
if (influxDB == null) {
influxDB = InfluxDBFactory.connect(openurl, username, password);
influxDB.createDatabase(database);
}
return influxDB;
}
/**
 * Create a data retention policy: policy name / database name / DURATION 7200d
 * (data kept for 7200 days) / REPLICATION 1 (one replica); the trailing DEFAULT
 * marks it as the database's default policy.
 */
public void createRetentionPolicy(String policy) {
String command = String.format("CREATE RETENTION POLICY \"%s\" ON \"%s\" DURATION %s REPLICATION %s DEFAULT",
policy, database, "7200d", 1);
this.query(command);
}
/**
 * Query
 *
 * @param command the query statement
 * @return the query result
 */
public QueryResult query(String command) {
return influxDB.query(new Query(command, database));
}
public QueryResult query(String command, TimeUnit unit) {
return influxDB.query(new Query(command, database), unit);
}
/**
 * Insert a point with an explicit timestamp
 *
 * @param measurement measurement (table)
 * @param tags        tags
 * @param fields      fields
 * @param time        timestamp in milliseconds
 */
public void insert(String measurement, Map<String, String> tags, Map<String, Object> fields, long time) {
Point.Builder builder = Point.measurement(measurement);
builder.time(time, TimeUnit.MILLISECONDS);
builder.tag(tags);
builder.fields(fields);
influxDB.write(database, "", builder.build());
}
/**
 * Insert a point with the server-assigned timestamp
 *
 * @param measurement measurement (table)
 * @param tags        tags
 * @param fields      fields
 */
public void insert(String measurement, Map<String, String> tags, Map<String, Object> fields) {
Point.Builder builder = Point.measurement(measurement);
builder.tag(tags);
builder.fields(fields);
influxDB.write(database, "", builder.build());
}
/**
 * Insert a single point under the given retention policy
 *
 * @param autogen retention policy name
 * @param point   the point to write
 */
public void pointInsert(String autogen, Point point) {
    influxDB.write(database, autogen, point);
}
/**
 * Batch insert
 *
 * @param batchPoints points to write in one batch
 */
public void batchInsert(BatchPoints batchPoints) {
influxDB.write(batchPoints);
}
/**
 * Batch insert in chunks
 *
 * @param pointList points to write
 * @param batchSize chunk size
 */
public void insertBatch(List<Point> pointList, int batchSize) {
Long startTime = System.currentTimeMillis();
BatchPoints batchPoints = BatchPoints
.database(database)
.consistency(InfluxDB.ConsistencyLevel.ALL)
.build();
// Accumulate points and flush to InfluxDB every batchSize points
int i = 0;
for (Point point : pointList) {
    batchPoints.point(point);
    i++;
    // Once batchSize points have been read, write them and start a new batch
    if (i == batchSize) {
i = 0;
influxDB.write(batchPoints);
batchPoints = BatchPoints
.database(database)
.consistency(InfluxDB.ConsistencyLevel.ALL)
.build();
}
}
influxDB.write(batchPoints);
Long endTime = System.currentTimeMillis();
// log.info("InfluxDB import time-consuming:【\033[31m{}\033[0m】", DateUtil.timeConsuming(startTime, endTime));
}
/**
 * Delete
 *
 * @param command the delete statement
 * @return the error message, if any
 */
public String deleteMeasurementData(String command) {
QueryResult result = influxDB.query(new Query(command, database));
return result.getError();
}
/**
 * Create a database
 *
 * @param dbName database name
 */
public void createDB(String dbName) {
influxDB.createDatabase(dbName);
}
/**
 * Drop a database
 *
 * @param dbName database name
 */
public void deleteDB(String dbName) {
influxDB.deleteDatabase(dbName);
}
/**
 * Extract the first series from a query result
 *
 * @param queryResult the raw query result
 */
public static QueryResult.Series getResult(QueryResult queryResult) {
List<QueryResult.Result> results = queryResult.getResults();
if (results == null || results.get(0).getSeries() == null) {
return null;
}
// A single SQL statement's result is at index 0
QueryResult.Result result = results.get(0);
return result.getSeries().isEmpty() ? null : result.getSeries().get(0);
}
/**
 * Close the connection
 */
public void close(){
influxDB.close();
}
}
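A minimal usage sketch for the service above; the URL, credentials and database name are placeholders for a reachable InfluxDB instance:
```java
import com.zork.common.service.InfluxDBService;
import org.influxdb.dto.QueryResult;
import java.util.Collections;

public class InfluxDBServiceDemo {
    public static void main(String[] args) {
        InfluxDBService service = new InfluxDBService("admin", "admin", "http://127.0.0.1:8086", "demo_db");
        service.influxDbBuild(); // connect and create the database if it does not exist
        // Write a single point with one tag and one field.
        service.insert("cpu", Collections.singletonMap("host", "node1"),
                Collections.singletonMap("usage", (Object) 0.42));
        QueryResult result = service.query("SELECT * FROM cpu LIMIT 10");
        System.out.println(InfluxDBService.getResult(result));
        service.close();
    }
}
```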
package com.zork.common.utils;
import com.opencsv.CSVReader;
import com.opencsv.exceptions.CsvValidationException;
import com.zork.common.dto.KeyValueDTO;
import lombok.extern.slf4j.Slf4j;
import org.apache.commons.lang.StringUtils;
import java.io.*;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
/**
* @Author: Prock.Liy
* @Date: 2021/7/21
* @Description:
*/
@Slf4j
public class FileUtil {
/**
 * Write a file
 *
 * @param fileName target file
 * @param content  content to write
 * @param isAppend whether to append to the end of the file instead of overwriting
 */
public static void appendMethodB(String fileName, String content, boolean isAppend) {
    try {
        // The second FileWriter argument enables append mode: when true, bytes are
        // written to the end of the file rather than the beginning.
        FileWriter writer = new FileWriter(fileName, isAppend);
        if (isAppend) {
            writer.write(content + "\n");
        } else {
            writer.write(content);
        }
writer.flush();
writer.close();
} catch (IOException e) {
log.error("\n 文件写入异常: -> {}", e.getMessage());
}
}
/**
 * Count the lines in a file
 *
 * @param filePath filePath
 * @return Integer
 */
public static Integer readerFileRow(String filePath) {
int linenumber = 0;
try {
File file = new File(filePath);
if (file.exists()) {
FileReader fr = new FileReader(file);
LineNumberReader lnr = new LineNumberReader(fr);
while (lnr.readLine() != null) {
linenumber++;
}
lnr.close();
} else {
log.error("File does not exists!");
}
} catch (IOException e) {
e.printStackTrace();
log.error("readerFileRow Exception!");
}
return linenumber;
}
/**
 * Read a CSV file
 *
 * @param csvFile file path
 */
public static KeyValueDTO readCsv(String csvFile) {
KeyValueDTO keyValueDTO = new KeyValueDTO();
CSVReader reader = null;
try {
reader = new CSVReader(new FileReader(csvFile));
String[] line;
String key = StringUtils.EMPTY;
int index = 0;
List<String> valueList = new ArrayList<>();
while ((line = reader.readNext()) != null) {
// Extract the header keys from the first row
if (index == 0) {
key = StringUtils
.substringAfter(String.join("; ", line), ";")
.replaceAll("\"", "").trim();
keyValueDTO.setKeyList(Arrays.asList(key.split(";")));
}
// Collect the value rows
if (index != 0) {
valueList.add(String.join(";", line));
}
index++;
}
keyValueDTO.setValueList(valueList);
} catch (IOException | CsvValidationException e) {
log.error("解析文件数据出错,本次导入InfluxDB任务退出!,Error:【\033[31m{}\033[0m】", e.getMessage());
}
return keyValueDTO;
}
/**
 * Filename filter matching a given extension
 */
public static class MyExtFilter implements FilenameFilter {
private final String ext;
public MyExtFilter(String ext) {
this.ext = ext;
}
public boolean accept(File dir, String name) {
return name.endsWith(ext);
}
}
/**
 * Read the full contents of a file
 *
 * @param filePath file path
 * @return the file contents
 */
public static String readFile(String filePath) {
// Java 8 try-with-resources form
StringBuffer buffer = new StringBuffer();
try (BufferedReader br = Files.newBufferedReader(Paths.get(filePath))) {
String line;
while ((line = br.readLine()) != null) {
buffer.append(line + "\n");
}
} catch (Exception e) {
log.error("readFile Error:{} , filePath:{}", e.getMessage(), filePath);
}
return buffer.toString();
}
/**
 * Determine from a txt calendar file whether the given day is a trading day
 *
 * @param filePath path to the trading-day list
 * @return true if today appears in the file
 */
public static boolean judgeTradingDay(String filePath, String today) {
try (BufferedReader br = Files.newBufferedReader(Paths.get(filePath))) {
String day;
while ((day = br.readLine()) != null) {
if (today.equals(day)) {
return true;
}
}
} catch (Exception e) {
log.error("judgeTradingDay Error , filePath:{} Error: {}", filePath, e.getMessage());
}
return false;
}
}
package com.zork.common.utils;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Function;
import java.util.function.Predicate;
/**
 * Collection utilities
 * @Date 2022.06.21 16:53
 * @author Prock.Liy
 */
public class ListUtil {
/**
 * Deduplicate by a derived key
 *
 * @param keyExtractor function that extracts the deduplication key
 * @param <T> element type
 * @return a stateful predicate that passes only the first element seen per key
 */
public static <T> Predicate<T> distinctByKey(Function<? super T, Object> keyExtractor) {
Map<Object, Boolean> map = new ConcurrentHashMap<>();
return t -> map.putIfAbsent(keyExtractor.apply(t), Boolean.TRUE) == null;
}
}
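A minimal usage sketch for distinctByKey in a stream pipeline (demo data only):
```java
import com.zork.common.utils.ListUtil;
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

public class ListUtilDemo {
    public static void main(String[] args) {
        List<String> names = Arrays.asList("alpha", "apple", "beta", "bear");
        // Keep only the first element seen for each initial letter.
        List<String> distinct = names.stream()
                .filter(ListUtil.distinctByKey(s -> s.charAt(0)))
                .collect(Collectors.toList());
        System.out.println(distinct); // [alpha, beta]
    }
}
```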
package com.zork.common.utils;
import com.alibaba.fastjson.JSONArray;
import com.alibaba.fastjson.JSONObject;
import lombok.extern.slf4j.Slf4j;
import org.springframework.http.*;
import org.springframework.web.client.RestTemplate;
import static com.zork.common.utils.DateUtil.yearMonthDayPoint;
/**
* RestTemplate 远程调用公共类
*
* @author Prock.Liy
*/
@Slf4j
public class RestTemplateUtil {
private static final RestTemplate restTemplate = new RestTemplate();
// --------------------------------------- GET -------------------------------------------------------------
/**
 * GET returning a JSONArray
 * @param url request URL
 * @return the response body (empty array on error)
 */
public static JSONArray getJSONArray(String url) {
JSONArray jsonArray = new JSONArray();
try {
jsonArray = restTemplate.exchange(url, HttpMethod.GET,null, JSONArray.class).getBody();
}catch (Exception e){
log.error("getJSONArray RestTemplate Error:{}",e.getMessage());
}
return jsonArray;
}
// --------------------------------------- POST -------------------------------------------------------------
/**
 * POST a JSON body; the URL may contain a %s placeholder that is filled with the current date
 * @param url request URL template
 * @return the response body (empty object on error)
 */
public static JSONObject postJSONObject(String url,JSONObject param) {
JSONObject jsonObject = new JSONObject();
try {
HttpHeaders headers = new HttpHeaders();
headers.setContentType(MediaType.APPLICATION_JSON);
HttpEntity entity = new HttpEntity<>(param, headers);
ResponseEntity<JSONObject> result = restTemplate.exchange(String.format(url, yearMonthDayPoint()), HttpMethod.POST, entity, JSONObject.class);
jsonObject = result.getBody();
}catch (Exception e){
log.error("postJSONObject RestTemplate Error:{}",e.getMessage());
}
return jsonObject;
}
}
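A minimal usage sketch; the endpoint URL is a placeholder (note the %s in the URL, which postJSONObject fills with the current date):
```java
import com.alibaba.fastjson.JSONObject;
import com.zork.common.utils.RestTemplateUtil;

public class RestTemplateUtilDemo {
    public static void main(String[] args) {
        JSONObject param = new JSONObject();
        param.put("name", "demo");
        // POST the JSON body; on any error an empty JSONObject comes back.
        JSONObject body = RestTemplateUtil.postJSONObject("http://127.0.0.1:8080/api/report/%s", param);
        System.out.println(body);
    }
}
```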
package com.zork.common.utils;
import java.io.IOException;
import java.io.LineNumberReader;
import java.io.PrintWriter;
import java.io.Reader;
import java.sql.*;
/**
* @Author: Prock.Liy
* @Date: 2021/6/4
* @Description:
*/
public class ScriptRunner {
private static final String DEFAULT_DELIMITER = ";";
private final Connection connection;
private final boolean stopOnError;
private final boolean autoCommit;
private PrintWriter logWriter = new PrintWriter(System.out);
private PrintWriter errorLogWriter = new PrintWriter(System.err);
private String delimiter = DEFAULT_DELIMITER;
private boolean fullLineDelimiter = false;
/**
* Default constructor
*/
public ScriptRunner(Connection connection, boolean autoCommit,
boolean stopOnError) {
this.connection = connection;
this.autoCommit = autoCommit;
this.stopOnError = stopOnError;
}
public void setDelimiter(String delimiter, boolean fullLineDelimiter) {
this.delimiter = delimiter;
this.fullLineDelimiter = fullLineDelimiter;
}
/**
* Setter for logWriter property
*
* @param logWriter
* - the new value of the logWriter property
*/
public void setLogWriter(PrintWriter logWriter) {
this.logWriter = logWriter;
}
/**
* Setter for errorLogWriter property
*
* @param errorLogWriter
* - the new value of the errorLogWriter property
*/
public void setErrorLogWriter(PrintWriter errorLogWriter) {
this.errorLogWriter = errorLogWriter;
}
/**
* Runs an SQL script (read in using the Reader parameter)
*
* @param reader
* - the source of the script
*/
public void runScript(Reader reader) throws IOException, SQLException {
try {
boolean originalAutoCommit = connection.getAutoCommit();
try {
if (originalAutoCommit != this.autoCommit) {
connection.setAutoCommit(this.autoCommit);
}
runScript(connection, reader);
} finally {
connection.setAutoCommit(originalAutoCommit);
}
} catch (IOException e) {
throw e;
} catch (SQLException e) {
throw e;
} catch (Exception e) {
throw new RuntimeException("Error running script. Cause: " + e, e);
}
}
/**
* Runs an SQL script (read in using the Reader parameter) using the
* connection passed in
*
* @param conn
* - the connection to use for the script
* @param reader
* - the source of the script
* @throws SQLException
* if any SQL errors occur
* @throws IOException
* if there is an error reading from the Reader
*/
private void runScript(Connection conn, Reader reader) throws IOException,
SQLException {
StringBuffer command = null;
try {
LineNumberReader lineReader = new LineNumberReader(reader);
String line = null;
while ((line = lineReader.readLine()) != null) {
if (command == null) {
command = new StringBuffer();
}
String trimmedLine = line.trim();
if (trimmedLine.startsWith("--")) {
    // Echo SQL "--" comments to the log
    println(trimmedLine);
} else if (trimmedLine.length() < 1
        || trimmedLine.startsWith("//")) {
    // Skip blank lines and "//" comments
} else if (!fullLineDelimiter
&& trimmedLine.endsWith(getDelimiter())
|| fullLineDelimiter
&& trimmedLine.equals(getDelimiter())) {
command.append(line, 0, line
.lastIndexOf(getDelimiter()));
command.append(" ");
Statement statement = conn.createStatement();
println(command);
boolean hasResults = false;
if (stopOnError) {
hasResults = statement.execute(command.toString());
} else {
try {
statement.execute(command.toString());
} catch (SQLException e) {
e.fillInStackTrace();
printlnError("Error executing: " + command);
printlnError(e);
}
}
if (autoCommit && !conn.getAutoCommit()) {
conn.commit();
}
ResultSet rs = statement.getResultSet();
if (hasResults && rs != null) {
ResultSetMetaData md = rs.getMetaData();
int cols = md.getColumnCount();
// JDBC column indexes are 1-based
for (int i = 1; i <= cols; i++) {
    String name = md.getColumnLabel(i);
print(name + "\t");
}
println("");
while (rs.next()) {
for (int i = 1; i <= cols; i++) {
    String value = rs.getString(i);
print(value + "\t");
}
println("");
}
}
command = null;
try {
statement.close();
} catch (Exception e) {
// Ignore to workaround a bug in Jakarta DBCP
}
Thread.yield();
} else {
command.append(line);
command.append(" ");
}
}
if (!autoCommit) {
conn.commit();
}
} catch (SQLException e) {
e.fillInStackTrace();
printlnError("Error executing: " + command);
printlnError(e);
throw e;
} catch (IOException e) {
e.fillInStackTrace();
printlnError("Error executing: " + command);
printlnError(e);
throw e;
} finally {
conn.rollback();
flush();
}
}
private String getDelimiter() {
return delimiter;
}
private void print(Object o) {
    if (logWriter != null) {
        logWriter.print(o);
    }
}
private void println(Object o) {
if (logWriter != null) {
logWriter.println(o);
}
}
private void printlnError(Object o) {
if (errorLogWriter != null) {
errorLogWriter.println(o);
}
}
private void flush() {
if (logWriter != null) {
logWriter.flush();
}
if (errorLogWriter != null) {
errorLogWriter.flush();
}
}
}
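A minimal usage sketch for this ScriptRunner; the JDBC URL, credentials and script path are placeholders, and a matching JDBC driver is assumed to be on the classpath:
```java
import com.zork.common.utils.ScriptRunner;
import java.io.FileReader;
import java.sql.Connection;
import java.sql.DriverManager;

public class ScriptRunnerDemo {
    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection(
                "jdbc:mysql://127.0.0.1:3306/demo", "root", "secret")) {
            // autoCommit=false commits the whole script at the end; stopOnError=true aborts on the first failure.
            ScriptRunner runner = new ScriptRunner(conn, false, true);
            runner.runScript(new FileReader("init.sql"));
        }
    }
}
```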
package com.zork.common.utils;
import com.zork.common.dto.ExecuteResultDTO;
import lombok.extern.slf4j.Slf4j;
import org.apache.ibatis.io.Resources;
import org.apache.ibatis.jdbc.ScriptRunner;
import java.io.*;
import java.nio.charset.StandardCharsets;
import java.sql.Connection;
import java.sql.DriverManager;
/**
* @Author: Prock.Liy
* @Date: 2021/6/4
* @Description:
*/
@Slf4j
public class SqlFileExecuteUtil {
/**
 * Execute a SQL file
 * @param taskStepId task step ID
 * @param url database URL
 * @param userName user name
 * @param userPassword password
 * @param filePath path to the SQL file
 * @return execution result
 * @throws IOException
 */
public static ExecuteResultDTO executeSql(String taskStepId, String url, String userName, String userPassword, String filePath) throws IOException {
ExecuteResultDTO executeResult = new ExecuteResultDTO();
executeResult.setTaskStepId(taskStepId);
// Output streams for SQL execution logs
StringWriter succeedWriter = new StringWriter();
PrintWriter succeedOut = new PrintWriter(succeedWriter);
StringWriter errorWriter = new StringWriter();
PrintWriter errorOut = new PrintWriter(errorWriter);
Connection conn = null;
ScriptRunner runner;
try {
conn = getMySqlConnection(url, userName, userPassword);
} catch (Exception e) {
    e.printStackTrace();
    executeResult.setStatus(1);
    executeResult.setLog(e.getMessage());
    log.error("ExecuteSqlException: " + e.getMessage());
    try {
        if (conn != null) {
            conn.close();
        }
} catch (Exception ee) {
ee.printStackTrace();
log.error("ExecuteSqlException: "+url + "关闭连接错误!");
}
return executeResult;
}
runner = new ScriptRunner(conn);
// Set the charset, otherwise Chinese text is garbled on insert
Resources.setCharset(StandardCharsets.UTF_8);
runner.setAutoCommit(false);
runner.setSendFullScript(true);
// Wire up the log writers
runner.setLogWriter(succeedOut);
runner.setErrorLogWriter(errorOut);
// Stop on the first error
runner.setStopOnError(true);
// Read the script from an absolute path
Reader read = new FileReader(filePath);
try {
runner.runScript(read);
} catch (Exception e) {
e.printStackTrace();
executeResult.setStatus(1);
executeResult.setLog(errorWriter.toString());
log.error("ExecuteSqlException: "+e.getMessage());
return executeResult;
} finally {
try {
runner.closeConnection();
conn.close();
} catch (Exception e) {
e.printStackTrace();
log.error("ExecuteSqlException: "+url + "关闭连接错误!");
}
}
executeResult.setSucceed(true);
executeResult.setLog(succeedWriter.getBuffer().toString());
executeResult.setStatus(0);
return executeResult;
}
/**
 * Obtain a database connection; note this loads the SQL Server driver, despite the method name.
 * @return a JDBC connection
 * @throws Exception
 */
public static Connection getMySqlConnection(String url, String userName, String userPassword) throws Exception {
Class.forName("com.microsoft.sqlserver.jdbc.SQLServerDriver");
return DriverManager.getConnection(url, userName, userPassword);
}
}
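A minimal usage sketch; the connection details and script path are placeholders (a SQL Server URL is used because getMySqlConnection loads the SQL Server driver):
```java
import com.zork.common.dto.ExecuteResultDTO;
import com.zork.common.utils.SqlFileExecuteUtil;

public class SqlFileExecuteDemo {
    public static void main(String[] args) throws Exception {
        ExecuteResultDTO result = SqlFileExecuteUtil.executeSql(
                "step-001",
                "jdbc:sqlserver://127.0.0.1:1433;databaseName=demo",
                "sa", "secret", "/tmp/init.sql");
        // status 0 means success; the captured runner output is in the log.
        System.out.println(result.isSucceed() + " -> " + result.getLog());
    }
}
```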
package com.zork.common.utils;
import java.util.regex.Matcher;
import java.util.regex.Pattern;
/**
* @Author: Prock.Liy
* @Date: 2021/6/18
* @Description:
*/
public class StringUtil {
/**
 * Return the part of the string starting at the Nth occurrence of '\n'
 * @param value the source string
 * @param index which occurrence to cut at
 * @return the substring from that newline onward, or null on failure
 */
public static String stringIntercept(String value,Integer index) {
try {
Pattern pattern = Pattern.compile("\n");
Matcher findMatcher = pattern.matcher(value);
int number = 0;
while(findMatcher.find()) {
number++;
// Break once the Nth occurrence has been found
if(number == index){
break;
}
}
int i = findMatcher.start();
return value.substring(i);
}catch (Exception e){
e.printStackTrace();
}
return null;
}
}
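A minimal usage sketch for stringIntercept (demo input only):
```java
import com.zork.common.utils.StringUtil;

public class StringUtilDemo {
    public static void main(String[] args) {
        String text = "header\nline1\nline2";
        // Returns everything from the second '\n' onward, newline included: "\nline2".
        System.out.println(StringUtil.stringIntercept(text, 2));
    }
}
```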
package com.zork.common.utils;
import org.yaml.snakeyaml.Yaml;
import java.io.File;
import java.io.FileReader;
import java.io.Reader;
import java.util.HashMap;
import java.util.Map;
import java.util.Set;
/**
* @Author: Prock.Liy
* @Date: 2021/10/18
 * @Description: YAML parsing utilities
*/
public class YamlUtil {
/**
 * Fetch the named mapping from a YAML file, returned as a map
 *
 * @param path       path to the YAML file
 * @param sourcename the key to look up
 * @return the mapping under that key
 */
public static Map<String, Object> getResMap(String path,String sourcename) {
return YmlInit.getMapByName(YmlInit.getYml(path), sourcename);
}
// Helper that parses a YAML file into a map (re-read on every call in this variant)
private static class YmlInit {
    // Parse the configuration file at the given path into a map
private static Map<String, Object> getYml(String path) {
Yaml yml = new Yaml();
Reader reader = null;
try {
reader = new FileReader(new File(path));
} catch (Exception e) {
// TODO: handle exception
e.printStackTrace();
}
return yml.loadAs(reader, Map.class);
}
// Look up the requested key, recursing into nested maps
private static Map<String, Object> getMapByName(Map<String, Object> map, String name) {
Map<String, Object> maps = new HashMap<String, Object>();
Set<Map.Entry<String, Object>> set = map.entrySet();
for (Map.Entry<String, Object> entry : set) { // iterate over the map
    Object obj = entry.getValue();
    if (entry.getKey().equals(name)) // recursion end condition
        return (Map<String, Object>) obj;
    if (entry.getValue() instanceof Map) { // recurse when the value is itself a map
        maps = getMapByName((Map<String, Object>) obj, name);
        if (maps == null) // nothing found in this branch, keep iterating
            continue;
        return maps; // found, return it
    }
}
return null;
}
}
}
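A minimal usage sketch; the file path and the "datasource" key are placeholders:
```java
import com.zork.common.utils.YamlUtil;
import java.util.Map;

public class YamlUtilDemo {
    public static void main(String[] args) {
        // Recursively searches the parsed YAML tree for the "datasource" mapping.
        Map<String, Object> datasource = YamlUtil.getResMap("/opt/app/application.yml", "datasource");
        System.out.println(datasource.get("url"));
    }
}
```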
package com.zork.common.utils;
import org.yaml.snakeyaml.Yaml;
import java.io.*;
import java.util.HashMap;
import java.util.Map;
import java.util.Set;
/**
* @Author: Prock.Liy
* @Date: 2021/8/11
* @Description:
*/
public class YmlUtil {
/**
 * Fetch the named mapping from application.yml, returned as a map
 *
 * @param sourcename the key to look up
 * @return the mapping under that key
 */
public static Map<String, Object> getResMap(String sourcename) {
return YmlInit.getMapByName(YmlInit.ymlMap, sourcename);
}
// The configuration file only needs to be read once; the parsed data is stored in a
// map declared final so it can only be assigned once.
private static class YmlInit {
    // Map obtained by parsing the file
    private static final Map<String, Object> ymlMap = getYml();
    // Read the configuration file and initialize ymlMap
private static Map<String, Object> getYml() {
Yaml yml = new Yaml();
String path = Object.class.getResource("/").getPath().substring(1) + "application.yml";
Reader reader = null;
try {
reader = new FileReader(new File(path));
} catch (Exception e) {
// TODO: handle exception
e.printStackTrace();
}
return yml.loadAs(reader, Map.class);
}
// Look up the requested key, recursing into nested maps
private static Map<String, Object> getMapByName(Map<String, Object> map, String name) {
Map<String, Object> maps = new HashMap<String, Object>();
Set<Map.Entry<String, Object>> set = map.entrySet();
for (Map.Entry<String, Object> entry : set) { // iterate over the map
    Object obj = entry.getValue();
    if (entry.getKey().equals(name)) // recursion end condition
        return (Map<String, Object>) obj;
    if (entry.getValue() instanceof Map) { // recurse when the value is itself a map
        maps = getMapByName((Map<String, Object>) obj, name);
        if (maps == null) // nothing found in this branch, keep iterating
            continue;
        return maps; // found, return it
    }
}
return null;
}
}
public static void main(String[] args) throws Exception {
    Yaml yml = new Yaml();
    // Configuration file path
    String path = Object.class.getResource("/").getPath().substring(1) + "application-kafka.yml";
    InputStream reader = new FileInputStream(new File(path));
    // Load the configuration file as a Map; nested values come back as LinkedHashMaps
    Map map = yml.loadAs(reader, Map.class);
    /**
     * e.g. fetching a nested value:
     * spring:
     *   kafka:
     *     bootstrap-servers: localhost:9092
     */
    Map mapSpring = (Map) map.get("spring");
    Map mapKafka = (Map) mapSpring.get("kafka");
    System.out.println(mapKafka.get("bootstrap-servers")); // prints localhost:9092
    /**
     * For deeper structures, or to fetch a whole section dynamically, e.g.
     * spring:
     *   datasource:
     *     url: jdbc:mysql://localhost:3306/bc
     *     username: root
     *     password: 123456
     *     driver-class-name: com.mysql.jdbc.Driver
     * the recursive helper can be used instead:
     */
    // Map datasourceMap = getResMap("datasource");
    // System.out.println(datasourceMap.get("url"));               // jdbc:mysql://localhost:3306/bc
    // System.out.println(datasourceMap.get("username"));          // root
    // System.out.println(datasourceMap.get("password"));          // 123456
    // System.out.println(datasourceMap.get("driver-class-name")); // com.mysql.jdbc.Driver
}
}
package com.zork.common.utils;
import com.alibaba.fastjson.JSONObject;
import com.baomidou.mybatisplus.core.metadata.IPage;
import lombok.extern.slf4j.Slf4j;
import net.sourceforge.pinyin4j.PinyinHelper;
import javax.servlet.http.HttpServletResponse;
import java.io.IOException;
import java.net.Inet4Address;
import java.net.InetAddress;
import java.net.NetworkInterface;
import java.net.SocketException;
import java.util.ArrayList;
import java.util.Enumeration;
import java.util.HashMap;
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;
/**
* @Author Prock.Liy
* @Date 2020/10/10 13:55
* @Description
**/
@Slf4j
public class ZorkUtil {
/**
 * Get the pinyin initial of each Chinese character (non-Chinese characters are kept as-is)
 *
 * @param str input string
 * @return the string of initials
 */
public static String getPinYinHeadChar(String str) {
String convert = "";
for (int j = 0; j < str.length(); j++) {
char word = str.charAt(j);
String[] pinyinArray = PinyinHelper.toHanyuPinyinStringArray(word);
if (pinyinArray != null) {
convert += pinyinArray[0].charAt(0);
} else {
convert += word;
}
}
return convert;
}
/**
 * Iterate over all local network interfaces and pick the desired IP
 * @return the first non-virtual, non-loopback IPv4 address, or null
 */
public static String getLocalIp() {
try {
Enumeration<NetworkInterface> networkInterfaces = NetworkInterface.getNetworkInterfaces();
ArrayList<String > list = new ArrayList<>();
while (networkInterfaces.hasMoreElements()) {
NetworkInterface networkInterface = networkInterfaces.nextElement();
//isUp: the interface is up
//isVirtual: the interface is a virtual (sub-)interface
//isLoopback: the interface is the loopback interface
if (networkInterface.isUp()&&!networkInterface.isVirtual()&&!networkInterface.isLoopback()) {
//addresses bound to this interface
Enumeration<InetAddress> inetAddresses = networkInterface.getInetAddresses();
while (inetAddresses.hasMoreElements()) {
InetAddress inetAddress = inetAddresses.nextElement();
if (inetAddress instanceof Inet4Address) {
String hostAddress = inetAddress.getHostAddress();
String hostName = inetAddress.getCanonicalHostName();
list.add(hostAddress);
}
}
}
}
return list.get(0);
} catch (SocketException e) {
e.printStackTrace();
}
return null;
}
/**
 * Write a response
 *
 * @param response HttpServletResponse
 * @param contentType content-type
 * @param status HTTP status code
 * @param value response body
 * @throws IOException IOException
 */
public static void makeResponse(HttpServletResponse response, String contentType,
int status, Object value) throws IOException {
response.setContentType(contentType);
response.setStatus(status);
response.getOutputStream().write(JSONObject.toJSONString(value).getBytes());
}
/**
 * Regular-expression check
 *
 * @param regex the regular expression
 * @param value the string to match
 * @return the match result
 */
public static boolean match(String regex, String value) {
Pattern pattern = Pattern.compile(regex);
Matcher matcher = pattern.matcher(value);
return matcher.matches();
}
/**
 * Package paging data for front-end tables
 *
 * @param pageInfo pageInfo
 * @return Map<String, Object>
 */
public static Map<String, Object> getDataTable(IPage<?> pageInfo) {
Map<String, Object> data = new HashMap<>();
data.put("rows", pageInfo.getRecords());
data.put("total", pageInfo.getTotal());
return data;
}
}
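A minimal usage sketch for a few of the helpers above (demo values only):
```java
import com.zork.common.utils.ZorkUtil;

public class ZorkUtilDemo {
    public static void main(String[] args) {
        System.out.println(ZorkUtil.getPinYinHeadChar("国泰君安")); // gtja
        System.out.println(ZorkUtil.match("\\d{4}", "2021"));       // true
        System.out.println(ZorkUtil.getLocalIp());                  // first non-loopback IPv4 address
    }
}
```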
version=2.2-RELEASE
groupId=com.zork
artifactId=zork-common
com/zork/common/utils/YamlUtil.class
com/zork/common/utils/RestTemplateUtil.class
com/zork/common/dto/NormalFieldsDTO.class
com/zork/common/dto/Dimension.class
com/zork/common/dto/KeyValueDTO.class
com/zork/common/dto/MetricDTO$MetricDTOBuilder.class
com/zork/common/utils/YamlUtil$YmlInit.class
com/zork/common/utils/SqlFileExecuteUtil.class
com/zork/common/utils/YmlUtil$YmlInit.class
com/zork/common/utils/ZorkUtil.class
com/zork/common/dto/Dimension$DimensionBuilder.class
com/zork/common/dto/ExecuteResultDTO.class
com/zork/common/utils/ListUtil.class
com/zork/common/utils/YmlUtil.class
com/zork/common/handler/ThrowingConsumer.class
com/zork/common/constant/InfluxDBConstant.class
com/zork/common/utils/DateUtil.class
com/zork/common/dto/NormalFieldsDTO$NormalFieldsDTOBuilder.class
com/zork/common/utils/FileUtil$MyExtFilter.class
com/zork/common/utils/FileUtil.class
com/zork/common/service/RedisService.class
com/zork/common/handler/ConsumerWrapperHandler.class
com/zork/common/configure/ZorkLettuceRedisConfigure.class
com/zork/common/constant/DateConstant.class
com/zork/common/CommonApplication.class
com/zork/common/service/InfluxDBService.class
com/zork/common/utils/StringUtil.class
com/zork/common/utils/ScriptRunner.class
com/zork/common/exception/ZorkException.class
com/zork/common/entity/QueryRequest.class
com/zork/common/entity/ZorkResponse.class
com/zork/common/constant/DimensionConstant.class
com/zork/common/handler/BaseExceptionHandler.class
com/zork/common/dto/MetricDTO.class
/Users/xuli/Documents/ZorkCodes/applets/zork-common/src/main/java/com/zork/common/constant/DateConstant.java
/Users/xuli/Documents/ZorkCodes/applets/zork-common/src/main/java/com/zork/common/service/RedisService.java
/Users/xuli/Documents/ZorkCodes/applets/zork-common/src/main/java/com/zork/common/utils/YamlUtil.java
/Users/xuli/Documents/ZorkCodes/applets/zork-common/src/main/java/com/zork/common/dto/ExecuteResultDTO.java
/Users/xuli/Documents/ZorkCodes/applets/zork-common/src/main/java/com/zork/common/utils/SqlFileExecuteUtil.java
/Users/xuli/Documents/ZorkCodes/applets/zork-common/src/main/java/com/zork/common/exception/ZorkException.java
/Users/xuli/Documents/ZorkCodes/applets/zork-common/src/main/java/com/zork/common/configure/ZorkLettuceRedisConfigure.java
/Users/xuli/Documents/ZorkCodes/applets/zork-common/src/main/java/com/zork/common/CommonApplication.java
/Users/xuli/Documents/ZorkCodes/applets/zork-common/src/main/java/com/zork/common/service/InfluxDBService.java
/Users/xuli/Documents/ZorkCodes/applets/zork-common/src/main/java/com/zork/common/utils/ZorkUtil.java
/Users/xuli/Documents/ZorkCodes/applets/zork-common/src/main/java/com/zork/common/entity/ZorkResponse.java
/Users/xuli/Documents/ZorkCodes/applets/zork-common/src/main/java/com/zork/common/dto/NormalFieldsDTO.java
/Users/xuli/Documents/ZorkCodes/applets/zork-common/src/main/java/com/zork/common/handler/ThrowingConsumer.java
/Users/xuli/Documents/ZorkCodes/applets/zork-common/src/main/java/com/zork/common/utils/RestTemplateUtil.java
/Users/xuli/Documents/ZorkCodes/applets/zork-common/src/main/java/com/zork/common/utils/ListUtil.java
/Users/xuli/Documents/ZorkCodes/applets/zork-common/src/main/java/com/zork/common/utils/YmlUtil.java
/Users/xuli/Documents/ZorkCodes/applets/zork-common/src/main/java/com/zork/common/utils/FileUtil.java
/Users/xuli/Documents/ZorkCodes/applets/zork-common/src/main/java/com/zork/common/utils/DateUtil.java
/Users/xuli/Documents/ZorkCodes/applets/zork-common/src/main/java/com/zork/common/constant/DimensionConstant.java
/Users/xuli/Documents/ZorkCodes/applets/zork-common/src/main/java/com/zork/common/constant/InfluxDBConstant.java
/Users/xuli/Documents/ZorkCodes/applets/zork-common/src/main/java/com/zork/common/utils/StringUtil.java
/Users/xuli/Documents/ZorkCodes/applets/zork-common/src/main/java/com/zork/common/entity/QueryRequest.java
/Users/xuli/Documents/ZorkCodes/applets/zork-common/src/main/java/com/zork/common/utils/ScriptRunner.java
/Users/xuli/Documents/ZorkCodes/applets/zork-common/src/main/java/com/zork/common/handler/ConsumerWrapperHandler.java
/Users/xuli/Documents/ZorkCodes/applets/zork-common/src/main/java/com/zork/common/dto/Dimension.java
/Users/xuli/Documents/ZorkCodes/applets/zork-common/src/main/java/com/zork/common/dto/KeyValueDTO.java
/Users/xuli/Documents/ZorkCodes/applets/zork-common/src/main/java/com/zork/common/dto/MetricDTO.java
/Users/xuli/Documents/ZorkCodes/applets/zork-common/src/main/java/com/zork/common/handler/BaseExceptionHandler.java
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<artifactId>zork-applets</artifactId>
<groupId>com.zork</groupId>
<version>2.2-RELEASE</version>
<relativePath>../zork-applets/pom.xml</relativePath>
</parent>
<artifactId>zork-jar</artifactId>
<packaging>pom</packaging>
<name>Zork-Jar</name>
<description>Zork-Jar standalone executable JAR modules</description>
<modules>
<module>zork-jar-htpocesb</module>
<module>zork-jar-send-kafka</module>
<!-- <module>zork-jar-sendkafka</module>-->
</modules>
<dependencies>
<dependency>
<groupId>com.zork</groupId>
<artifactId>zork-common</artifactId>
<version>2.2-RELEASE</version>
</dependency>
<!-- MyBatis dynamic datasource -->
<dependency>
<groupId>com.baomidou</groupId>
<artifactId>dynamic-datasource-spring-boot-starter</artifactId>
<version>2.5.7</version>
</dependency>
<!-- p6spy prints the SQL executed by MyBatis to the console -->
<dependency>
<groupId>p6spy</groupId>
<artifactId>p6spy</artifactId>
<version>3.8.1</version>
</dependency>
<dependency>
<groupId>mysql</groupId>
<artifactId>mysql-connector-java</artifactId>
<scope>runtime</scope>
</dependency>
<!-- Spring Boot Admin (SBA) client -->
<dependency>
<groupId>de.codecentric</groupId>
<artifactId>spring-boot-admin-starter-client</artifactId>
<version>2.1.6</version>
</dependency>
<!-- Sleuth dependency -->
<dependency>
<groupId>org.springframework.cloud</groupId>
<artifactId>spring-cloud-starter-sleuth</artifactId>
</dependency>
<!-- Zipkin distributed tracing -->
<dependency>
<groupId>org.springframework.cloud</groupId>
<artifactId>spring-cloud-starter-zipkin</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.amqp</groupId>
<artifactId>spring-rabbit</artifactId>
</dependency>
<!-- ELK: Logstash log encoder -->
<dependency>
<groupId>net.logstash.logback</groupId>
<artifactId>logstash-logback-encoder</artifactId>
<version>6.1</version>
</dependency>
</dependencies>
</project>
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<artifactId>zork-jar</artifactId>
<groupId>com.zork</groupId>
<version>2.2-RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>
<artifactId>zork-jar-htpocesb</artifactId>
<name>Zork-Jar-Htpocesb</name>
<description>HtPocESB: parse a CSV file or directory, match rows against the mapping rules and write them to InfluxDB</description>
<dependencies>
<dependency>
<groupId>org.apache.poi</groupId>
<artifactId>poi</artifactId>
<version>3.17</version>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
</plugin>
</plugins>
</build>
</project>
package com.zork.htpocesb;
import com.zork.htpocesb.constant.DataConstant;
import com.zork.common.service.InfluxDBService;
import com.zork.htpocesb.thread.CompletableFutureAsync;
import com.zork.common.utils.FileUtil;
import com.zork.common.utils.DateUtil;
import lombok.extern.slf4j.Slf4j;
import org.apache.commons.lang.StringUtils;
import java.io.File;
import java.util.Arrays;
import static com.zork.htpocesb.component.CsvWriteInfluxDBComponent.invoke;
@Slf4j
public class HtpocesbApplication {

    /**
     * Program entry point.
     *
     * @param args expected: InfluxDB username, password, URL, database, and a CSV file (or directory) path
     */
    public static void main(String[] args) {
        if (args.length != 5) {
            // Wrap the array in Arrays.toString(): SLF4J would otherwise unpack a
            // String[] as varargs and log only its first element
            log.error("Invalid launch arguments: {}. Please supply arguments in the correct format!", Arrays.toString(args));
            log.error("Usage: {}",
                    "【java -jar htpocesb-0.0.1-SNAPSHOT.jar username password url database csvFile】");
            return;
        }
        InfluxDBService influxDBConnect = new InfluxDBService(args[0], args[1], args[2], args[3]);
        File file = new File(args[4]);
        if (file.isDirectory()) {
            // Collect the names of the csv files under the directory
            String[] csvFiles = file.list(new FileUtil.MyExtFilter(DataConstant.CSV));
            assert csvFiles != null;
            // Import the files concurrently via CompletableFuture
            CompletableFutureAsync.execute(influxDBConnect, args[4], Arrays.asList(csvFiles));
            return;
        }
        if (!file.exists()) {
            log.error("File {} does not exist!", args[4]);
            return;
        }
        if (file.isFile()) {
            // substringAfterLast, not substringAfter: a path such as ./data/file.csv
            // contains more than one dot and only the final suffix matters
            String fileSuffix = StringUtils.substringAfterLast(args[4], ".");
            if (!DataConstant.CSV.equals(fileSuffix)) {
                log.error("Please supply a valid csv file path!");
                return;
            }
            log.info("Start to read Csv file data, file path: 【\033[31m{}\033[0m】", args[4]);
            long startTime = System.currentTimeMillis();
            invoke(influxDBConnect, args[4]);
            log.info("Task execution success, time-consuming:【\033[31m{}\033[0m】", DateUtil.timeConsuming(startTime, System.currentTimeMillis()));
        }
    }
}
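For reference, the invocation documented by the usage log above, with hypothetical values for the five positional arguments (InfluxDB username, password, URL, database, and a CSV file or directory path):

```
java -jar htpocesb-0.0.1-SNAPSHOT.jar admin secret http://127.0.0.1:8086 htpocesb /data/esb/20210721.csv
```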
package com.zork.htpocesb.component;
import com.zork.common.constant.DateConstant;
import com.zork.htpocesb.constant.DataConstant;
import com.zork.common.dto.KeyValueDTO;
import com.zork.common.service.InfluxDBService;
import com.zork.common.utils.FileUtil;
import lombok.extern.slf4j.Slf4j;
import org.apache.commons.lang.StringUtils;
import org.influxdb.dto.Point;
import java.util.Arrays;
import java.util.LinkedList;
import java.util.List;
import java.util.concurrent.TimeUnit;
import static com.zork.common.utils.DateUtil.dateToStamp;
/**
* @Author: Prock.Liy
* @Date: 2021/7/21
* @Description:
*/
@Slf4j
public class CsvWriteInfluxDBComponent {

    /**
     * Parse CSV data, match each column against the key rules, and write the result to InfluxDB.
     *
     * @param influxDBConnect InfluxDB connection
     * @param csvFilePath     path of the CSV file to import
     */
    public static void invoke(InfluxDBService influxDBConnect, String csvFilePath) {
        try {
            int throughputCount = 0;
            int businessCount = 0;
            int esbCount = 0;
            influxDBConnect.influxDbBuild();
            influxDBConnect.createRetentionPolicy(DataConstant.AUTOGEN);
            // Parse the csv file into a key list (header) and a value list (rows)
            KeyValueDTO keyValueDTO = FileUtil.readCsv(csvFilePath);
            List<Point> pointList = new LinkedList<>();
            for (String valueArray : keyValueDTO.getValueList()) {
                // Convert the leading time column to an epoch-millisecond timestamp
                long timestamp = dateToStamp(StringUtils.substringBefore(valueArray, ";"), DateConstant.SS_XXX_TIME_08);
                // The remaining semicolon-separated columns hold the metric values
                List<String> valueList = Arrays.asList(StringUtils.substringAfter(valueArray, ";").split(";"));
                int count = 0;
                if (keyValueDTO.getKeyList().size() == valueList.size()) {
                    for (String value : valueList) {
                        if (DataConstant.NULL.equals(value)) {
                            count++;
                            continue;
                        }
                        // The header key selects the measurement: its space-separated token
                        // count maps to throughput (1), ESB (2) or business (3) fields
                        String key = keyValueDTO.getKeyList().get(count).trim();
                        List<String> keyValueList = Arrays.asList(key.split(" "));
                        switch (keyValueList.size()) {
                            case 1:
                                pointList.add(
                                        Point.measurement(DataConstant.THROUGHPUT)
                                                .time(timestamp, TimeUnit.MILLISECONDS)
                                                .addField(DataConstant.THROUGHPUT_METRIC, Double.valueOf(value))
                                                .tag(DataConstant.IP, keyValueList.get(0))
                                                .build()
                                );
                                throughputCount++;
                                break;
                            case 2:
                                pointList.add(
                                        Point.measurement(DataConstant.ESB_SYSTEM)
                                                .time(timestamp, TimeUnit.MILLISECONDS)
                                                .addField(DataConstant.ESB_SYSTEM_METRIC, Double.valueOf(value))
                                                .tag(DataConstant.IP, keyValueList.get(0))
                                                .tag(DataConstant.SYSTEM_ID, keyValueList.get(1))
                                                .build()
                                );
                                esbCount++;
                                break;
                            case 3:
                                pointList.add(
                                        Point.measurement(DataConstant.BUSINESS)
                                                .time(timestamp, TimeUnit.MILLISECONDS)
                                                .addField(DataConstant.FUNCTION_METRIC, Double.valueOf(value))
                                                .tag(DataConstant.IP, keyValueList.get(0))
                                                .tag(DataConstant.SYSTEM_ID, keyValueList.get(1))
                                                .tag(DataConstant.FUNC_ID, keyValueList.get(2))
                                                .build()
                                );
                                businessCount++;
                                break;
                        }
                        count++;
                    }
                }
            }
            influxDBConnect.insertBatch(pointList, 100000);
            if (throughputCount != 0) {
                log.info("Throughput import Success, number:{}", throughputCount);
            }
            if (esbCount != 0) {
                log.info("Esb system time consuming import Success, number:{}", esbCount);
            }
            if (businessCount != 0) {
                log.info("Business system time consuming import Success, number:{}", businessCount);
            }
        } catch (Exception e) {
            // Log the full stack trace through SLF4J rather than printStackTrace()
            log.error("Row matching failed, aborting this InfluxDB import! Error:【\033[31m{}\033[0m】", e.getMessage(), e);
        }
    }
}
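For reference, a hypothetical input illustrating the matching rule above (the exact CSV layout depends on FileUtil.readCsv, which is not shown in this diff; the header and values below are invented):

```
time;10.1.1.5;10.1.1.5 sys01;10.1.1.5 sys01 f001
2021-07-21 09:30:00;120.0;35.5;8.2
```

The one-token key `10.1.1.5` yields a `throughput` point tagged with `ip`; the two-token key adds `systemid` and goes to `esb_system_time_consuming`; the three-token key adds `funcid` and goes to `business_system_time_consuming`, all stamped with the row's timestamp.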
package com.zork.htpocesb.component;
import com.alibaba.fastjson.JSONArray;
import com.alibaba.fastjson.JSONObject;
import org.apache.commons.lang.StringUtils;
import java.io.*;
import java.util.Arrays;
import java.util.Scanner;
//import static com.zork.htpocesb.file.CallableJsonExcl.excelWrite;
/**
* @Author: Prock.Liy
* @Date: 2021/9/10
* @Description:
*/
public class JsonExcelPaper {
/**
* Local JSON file path
*/
public static final String JSON_FILE = "C:\\Users\\Prock.Liy\\Desktop\\smple\\Sample-united_states.json";
/**
* Excel export path
*/
public static final String EXCEL_PATH = "E://phone.xls";
// /**
// * Read local JSON-array files, parse out the phone numbers and export them to Excel
// *
// * @param args
// * @throws FileNotFoundException
// * @throws IOException
// */
// public static void main(String[] args) throws FileNotFoundException, IOException {
//
// File fileList = new File("C:\\Users\\Prock.Liy\\Desktop\\smple\\files");
// File[] files = fileList.listFiles();
// // iterate over the file array
// Arrays.stream(files).forEach(file -> {
// long start = System.currentTimeMillis();
// analyzeJsonFile(file.getPath(),file.getName());
// long end = System.currentTimeMillis();
// System.out.println(file.getName() + " took " + (end - start) + "ms");
// });
//
// }
//
// public static void analyzeJsonFile(String filePath,String fileName){
// FileInputStream inputStream = null;
// Scanner sc = null;
// try {
// File jsonFile = new File(filePath);
// // Construct BufferedReader from FileReader
// BufferedReader reader = new BufferedReader(new FileReader(jsonFile));
// JSONArray jsonArray = new JSONArray();
//
// inputStream = new FileInputStream(filePath);
// sc = new Scanner(inputStream, "UTF-8");
// while (sc.hasNextLine()) {
// String line = sc.nextLine();
// if (StringUtils.isEmpty(line)) {
// continue;
// }
// try {
// jsonArray.add(JSONObject.parseObject(line));
// }catch (Exception e){
// }
// }
// reader.close();
// if (jsonArray.size() == 0) {
// return;
// }
// if (jsonArray.size() > 0) {
// System.out.println("开始解析JsonArray数据:------------" + fileName);
// excelWrite(fileName, jsonArray);
// System.out.println("解析JsonArray数据:------------" + fileName);
// }
// } catch (IOException e) {
// // TODO Auto-generated catch block
// e.printStackTrace();
// }
// }
}
package com.zork.htpocesb.constant;
/**
* @Author: Prock.Liy
* @Date: 2021/8/6
* @Description:
*/
public class DataConstant {
public static final String NULL = "null";
public static final String IP = "ip";
public static final String SYSTEM_ID = "systemid";
public static final String FUNC_ID = "funcid";
public static final String THROUGHPUT = "throughput";
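// NOTE: "througthoutmetric" below appears to be a typo for "throughputmetric"; the value is left as-is because renaming the field would detach it from any InfluxDB series already written under this name.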
public static final String THROUGHPUT_METRIC = "througthoutmetric";
public static final String BUSINESS = "business_system_time_consuming";
public static final String FUNCTION_METRIC = "functionmetric";
public static final String ESB_SYSTEM = "esb_system_time_consuming";
public static final String ESB_SYSTEM_METRIC = "esbsystemmetric";
public static final String CSV = "csv";
public static final String AUTOGEN = "autogen";
}
package com.zork.htpocesb.file;
import com.alibaba.fastjson.JSONArray;
import com.alibaba.fastjson.JSONObject;
import com.alibaba.fastjson.serializer.SerializerFeature;
import org.apache.commons.lang.StringUtils;
import org.apache.poi.hssf.usermodel.*;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.text.SimpleDateFormat;
import java.util.*;
import java.util.concurrent.*;
/**
* @Author: Prock.Liy
* @Date: 2021/9/13
* @Description:
*/
public class CallableJsonExcl {
// final static SimpleDateFormat sdf = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
//
// public static void main(String[] args) {
// File f = new File("C:\\Users\\Prock.Liy\\Desktop\\smple\\files");
// // total number of files
// final List<File> filePathsList = new ArrayList<File>();
// File[] filePaths = f.listFiles();
// for (File s : filePaths) {
// filePathsList.add(s);
// }
//
// CountDownLatch latch = new CountDownLatch(filePathsList.size());
// ExecutorService pool = Executors.newCachedThreadPool();
//
// BlockingQueue<Future<Map<String, FileInputStream>>> queue =
// new ArrayBlockingQueue<Future<Map<String, FileInputStream>>>(150);
//
// System.out.println("-------------文件读、写任务开始时间:" + sdf.format(new Date()));
// for (int i = 0; i < filePathsList.size(); i++) {
// File temp = filePathsList.get(i);
// Future<Map<String, FileInputStream>> future = pool.submit(new MyCallableProducer(latch, temp));
// queue.add(future);
// }
//
// try {
// latch.await();
// } catch (InterruptedException e) {
// e.printStackTrace();
// }
// System.out.println("-------------文件读、写任务结束时间:" + sdf.format(new Date()));
// pool.shutdownNow();
// }
//
//
// // file-reading thread
// static class MyCallableProducer implements Callable<Map<String, FileInputStream>> {
// private CountDownLatch latch;
// private File file;
// private FileInputStream fis = null;
// private Map<String, FileInputStream> fileMap = new HashMap<String, FileInputStream>();
//
// public MyCallableProducer(CountDownLatch latch, File file) {
// this.latch = latch;
// this.file = file;
// }
//
// @Override
// public Map<String, FileInputStream> call() throws Exception {
// System.out.println(Thread.currentThread().getName() + " 线程开始读取文件 :" + file.getName() + " ,时间为 " + sdf.format(new Date()));
// fis = new FileInputStream(file);
// fileMap.put(file.getName(), fis);
// JSONArray jsonArray = doWork(file.getName(), fis);
// System.out.println("数据大小: " + jsonArray.size());
//
//
// System.out.println(Thread.currentThread().getName() + " 线程读取文件 :" + file.getName() + " 完毕" + " ,时间为 " + sdf.format(new Date()));
// latch.countDown();
// return fileMap;
// }
//
// private JSONArray doWork(String fileName, FileInputStream inputStream) {
// Scanner sc = null;
// JSONArray jsonArray = new JSONArray();
// sc = new Scanner(inputStream, "UTF-8");
// while (sc.hasNextLine()) {
// String line = sc.nextLine();
// if (StringUtils.isEmpty(line)) {
// continue;
// }
// jsonArray.add(JSONObject.parseObject(line));
// }
//
// if (jsonArray.size() > 0) {
// System.out.println("开始解析JsonArray数据:------------" + fileName);
// excelWrite(fileName, jsonArray);
// System.out.println("解析JsonArray数据:------------" + fileName);
// }
// // Business logic (e.g. wrapping POJOs) can be added here; the return value can be of any type
// Random rand = new Random();
// int time = rand.nextInt(10) * 1000;
// try {
// Thread.sleep(time);
// } catch (InterruptedException e) {
// e.printStackTrace();
// }
// return jsonArray;
// }
//
//
//
// }
//
// public static boolean excelWrite(String filePath, JSONArray jsonArray) {
// try {
// // Title
// String title = "Phone";
// // Create a workbook
// HSSFWorkbook workbook = new HSSFWorkbook();
// // Create a sheet
// HSSFSheet sheet = workbook.createSheet();
// // Set the column width
// sheet.setColumnWidth(0, 5000);
// // Create the first row
// HSSFRow row = sheet.createRow(0);
// // Create a cell
// HSSFCell cell = null;
// // Create the header
//
// cell = row.createCell(0);
// // Set the cell style
// HSSFCellStyle cellStyle = workbook.createCellStyle();
// // Set the font
// HSSFFont font = workbook.createFont();
// font.setFontName("宋体");
// // font.setFontHeight((short)12);
// font.setFontHeightInPoints((short) 13);
// cellStyle.setFont(font);
// cell.setCellStyle(cellStyle);
// cell.setCellValue(title);
//
// // Sample data
// List<String> list = JSONObject.parseArray(jsonArray.toJSONString(), String.class);
//
// // Append data starting from the second row
// for (int i = 1; i < (list.size() + 1); i++) {
// // Create row i
// HSSFRow nextRow = sheet.createRow(i);
// for (int j = 0; j < 8; j++) {
// JSONObject jsonObject = JSONObject.parseObject(list.get(i - 1));
// HSSFCell cell2 = nextRow.createCell(j);
// if (j == 0) {
// JSONArray dataArray = jsonObject.getJSONArray("phone_numbers");
// if (dataArray.size() == 0) {
// continue;
// }
// String phone = dataArray.toString(SerializerFeature.BeanToArray);
// cell2.setCellValue(
// // 替换["+"]符号
// phone.replaceAll("\"", "")
// .replaceAll("\\+", "")
// .replaceAll("\\[", "")
// .replaceAll("\\]", "")
// );
// }
// }
// }
//
// // Excel output file path
// workbook.write(new FileOutputStream(new File(
// "C:\\Users\\Prock.Liy\\Desktop\\smple\\excel\\" +
// StringUtils.substringBefore(filePath, "."))+".xls"));
// // Close the workbook
// workbook.close();
// return true;
// } catch (Exception e) {
// e.printStackTrace();
// return false;
// }
// }
}
package com.zork.htpocesb.thread;
import com.zork.common.service.InfluxDBService;
import com.zork.common.utils.DateUtil;
import lombok.extern.slf4j.Slf4j;
import java.io.File;
import java.util.List;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.function.BiConsumer;
import java.util.function.Function;
import static com.zork.htpocesb.component.CsvWriteInfluxDBComponent.invoke;
/**
 * @Author: Prock.Liy
 * @Date: 2021/7/26
 * @Description:
 */
@Slf4j
public class CompletableFutureAsync {

    /**
     * If no executor is passed in, CompletableFuture falls back to
     * ForkJoinPool.commonPool(), whose default parallelism is only
     * CPU cores - 1 (just 3 threads in our tests), so it is better
     * to supply a pool with an explicit thread count.
     * <p>
     * Default pool definition (15 threads).
     */
    protected static ExecutorService EXECUTOR_POOL = Executors.newFixedThreadPool(
            15
    );

    /**
     * Run the import tasks with Java 8 CompletableFuture.
     *
     * @param influxDBConnect InfluxDB connection
     * @param filePath        directory containing the CSV files
     * @param fileNameList    file names to import
     */
    public static void execute(InfluxDBService influxDBConnect, String filePath, List<String> fileNameList) {
        long startTime = System.currentTimeMillis();
        CompletableFuture<Void> allFutures = CompletableFuture.allOf(
                fileNameList.stream()
                        .map(fileName -> handle(influxDBConnect, filePath, fileName))
                        .toArray(CompletableFuture[]::new)
        );
        // join() blocks until every task has finished; the callback below therefore
        // fires immediately afterwards and shuts the pool down
        allFutures.join();
        allFutures.whenComplete(new BiConsumer<Void, Throwable>() {
            @Override
            public void accept(Void aVoid, Throwable throwable) {
                if (allFutures.isDone()) {
                    log.info("Task execution success, time-consuming:【\033[31m{}\033[0m】", DateUtil.timeConsuming(startTime, System.currentTimeMillis()));
                }
                EXECUTOR_POOL.shutdown();
            }
        });
    }

    /**
     * Run one file import on the pool.
     *
     * @param influxDBConnect InfluxDB connection
     * @param filePath        directory containing the CSV files
     * @param fileName        CSV file name
     * @return a future that completes when the file has been imported
     */
    public static CompletableFuture<Void> handle(InfluxDBService influxDBConnect, String filePath, String fileName) {
        return CompletableFuture.runAsync(() -> {
            // Use File.separator instead of a hard-coded backslash so the
            // path also resolves outside Windows
            String csvFilePath = String.format("%s%s%s", filePath, File.separator, fileName);
            log.info("Start to read Csv file data, file path: 【\033[31m{}\033[0m】", csvFilePath);
            invoke(influxDBConnect, csvFilePath);
            try {
                Thread.sleep(3000);
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
        }, EXECUTOR_POOL).exceptionally(new Function<Throwable, Void>() { // catch per-task exceptions so one failure does not abort the whole run
            @Override
            public Void apply(Throwable throwable) {
                log.info("Thread [{}] threw an exception; other threads continue. Details: [{}]", Thread.currentThread().getName(), throwable.getMessage());
                return null;
            }
        });
    }
}
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<artifactId>zork-jar</artifactId>
<groupId>com.zork</groupId>
<version>2.2-RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>
<artifactId>zork-jar-send-kafka</artifactId>
<name>Zork-Jar-Send-Kafka</name>
<description>Push messages to Kafka</description>
<dependencies>
<!-- https://mvnrepository.com/artifact/com.fasterxml.jackson.dataformat/jackson-dataformat-yaml -->
<dependency>
<groupId>com.fasterxml.jackson.dataformat</groupId>
<artifactId>jackson-dataformat-yaml</artifactId>
<version>2.9.5</version>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
</plugin>
</plugins>
</build>
</project>
package com.zork.send.kafka;
import com.alibaba.fastjson.JSON;
import com.zork.send.kafka.kafka.Producer;
import com.zork.send.kafka.model.KafkaConfig;
import lombok.extern.slf4j.Slf4j;
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import static com.zork.common.utils.YamlUtil.getResMap;
import static com.zork.send.kafka.component.SendKafkaComponent.seekData;
@Slf4j
public class SendKafkaApplication {

    /**
     * Load the Kafka settings from the configuration file.
     *
     * @param filePath configuration file path
     * @return the parsed KafkaConfig
     */
    public KafkaConfig seekKafkaYml(String filePath) {
        // Map<String, Object> map = new Yaml().load(this.getClass()
        //        .getClassLoader()
        //        .getResourceAsStream("application-kafka.yml"));
        Map<String, Object> map = getResMap(filePath, "kafka");
        return JSON.parseObject(JSON.toJSONString(map), KafkaConfig.class);
    }

    /**
     * Program entry point.
     *
     * @param args expected: path of the Kafka configuration file
     */
    public static void main(String[] args) {
        if (args.length != 1) {
            // Arrays.toString() keeps SLF4J from unpacking the String[] as varargs
            log.error("Invalid launch arguments: {}. Please supply arguments in the correct format!", Arrays.toString(args));
            log.error("Usage: {}", "【java -jar **.jar /opt/app/application-kafka.yml】");
            return;
        }
        // Load the YAML configuration
        KafkaConfig kafkaConfig = new SendKafkaApplication().seekKafkaYml(args[0]);
        // Build the message payloads
        List<String> valueList = seekData(kafkaConfig.getSendSize(), kafkaConfig.getMessageArraySize());
        // Send them to Kafka
        new Producer(valueList, kafkaConfig).start();
        // Leftover debug output: byte length of the bootstrap-servers string
        System.out.println(kafkaConfig.getServers().getBytes().length);
    }
}
package com.zork.send.kafka.component;
import com.zork.send.kafka.model.KafkaConfig;
import lombok.extern.slf4j.Slf4j;
import org.apache.kafka.clients.producer.KafkaProducer;
import java.util.ArrayList;
import java.util.List;
/**
* @Author: Prock.Liy
* @Date: 2021/10/14
* @Description:
*/
@Slf4j
public class SendKafkaComponent {

    // Shared producer instance (currently unused)
    private static KafkaProducer<Integer, String> producer;

    // TODO: not yet implemented
    public static void invoke(String value, KafkaConfig kafkaConfig) {
    }

    /**
     * Build messageArraySize payload strings of sendSize MiB each,
     * i.e. (sendSize * 1024) * 1024 single-byte 'a' characters per message.
     */
    public static List<String> seekData(Integer sendSize, Integer messageArraySize) {
        List<String> valueList = new ArrayList<>();
        for (int k = 0; k < messageArraySize; k++) {
            StringBuilder builder = new StringBuilder();
            for (int i = 0; i < (sendSize * 1024) * 1024; i++) {
                builder.append("a");
            }
            valueList.add(builder.toString());
        }
        return valueList;
    }
}
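A minimal usage sketch (not part of this commit) of the sizing contract above; the demo class name is hypothetical:

```java
import com.zork.send.kafka.component.SendKafkaComponent;
import java.util.List;

public class SeekDataDemo {
    public static void main(String[] args) {
        // sendSize = 1 (MiB per message), messageArraySize = 5 (number of messages)
        List<String> payloads = SendKafkaComponent.seekData(1, 5);
        System.out.println(payloads.size());          // 5
        System.out.println(payloads.get(0).length()); // 1048576 = (1 * 1024) * 1024
    }
}
```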
package com.zork.send.kafka.kafka;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.IntegerDeserializer;
import org.apache.kafka.common.serialization.StringDeserializer;
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;
/**
* @Author: Prock.Liy
* @Date: 2021/8/10
* @Description:
*/
public class Consumer extends Thread {

    KafkaConsumer<Integer, String> consumer;
    String topic;

    public Consumer(String topic) {
        Properties properties = new Properties();
        properties.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "192.168.33.11:9092");
        properties.put(ConsumerConfig.CLIENT_ID_CONFIG, "message");
        properties.put(ConsumerConfig.GROUP_ID_CONFIG, "message120211015");
        properties.put(ConsumerConfig.SESSION_TIMEOUT_MS_CONFIG, "30000");
        // Auto-commit interval: offsets are confirmed in batches
        properties.put(ConsumerConfig.AUTO_COMMIT_INTERVAL_MS_CONFIG, "1000");
        properties.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, IntegerDeserializer.class.getName());
        properties.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        // "earliest" lets a consumer in a brand-new group start from the beginning of
        // the topic, so it can also read messages published before it joined
        properties.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
        consumer = new KafkaConsumer<Integer, String>(properties);
        this.topic = topic;
    }

    @Override
    public void run() {
        consumer.subscribe(Collections.singleton(this.topic));
        while (true) {
            ConsumerRecords<Integer, String> consumerRecords = consumer.poll(Duration.ofSeconds(1));
            consumerRecords.forEach(record ->
                    System.out.println(record.key() + "->" + record.value() + "->" + record.offset()));
        }
    }
}
package com.zork.send.kafka.kafka;
import com.zork.send.kafka.model.KafkaConfig;
import lombok.extern.slf4j.Slf4j;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.IntegerSerializer;
import org.apache.kafka.common.serialization.StringSerializer;
import java.util.List;
import java.util.Properties;
import java.util.concurrent.TimeUnit;
/**
* @Author: Prock.Liy
* @Date: 2021/8/10
* @Description:
*/
@Slf4j
public class Producer extends Thread {

    private final KafkaProducer<Integer, String> producer;
    private final KafkaConfig kafkaConfig;
    private final List<String> messageList;

    public Producer(List<String> messageList, KafkaConfig kafkaConfig) {
        Properties properties = new Properties();
        properties.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, kafkaConfig.getServers());
        properties.put(ProducerConfig.CLIENT_ID_CONFIG, "producer");
        properties.put("auto.commit.interval.ms", "1000");
        properties.put("session.timeout.ms", "30000");
        // Allow requests of up to 100 MiB so the large generated payloads fit
        properties.put("max.request.size", "104857600");
        properties.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, IntegerSerializer.class.getName());
        properties.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        producer = new KafkaProducer<Integer, String>(properties);
        this.messageList = messageList;
        this.kafkaConfig = kafkaConfig;
    }

    @Override
    public void run() {
        if (kafkaConfig.getMessageArraySize() != messageList.size()) {
            log.error("Message batch was built incorrectly. Actual size: {}; configured messageArraySize: {}", messageList.size(), kafkaConfig.getMessageArraySize());
        }
        messageList.forEach(message -> {
            try {
                // Note: 3 is the record *key* here (ProducerRecord(topic, key, value)), not a partition index
                producer.send(new ProducerRecord<>(kafkaConfig.getTopic(), 3, message), ((recordMetadata, e) -> {
                    System.out.println(recordMetadata.offset() + "->" + recordMetadata.partition());
                    log.info("【Message sent to Kafka, {} bytes, size: {}】", message.getBytes().length, message.getBytes().length / (1024 * 1024) + "M");
                }));
                TimeUnit.SECONDS.sleep(2);
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
        });
    }
}
package com.zork.send.kafka.model;
/**
 * @Author: Prock.Liy
 * @Date: 2021/10/14
 * @Description: (the lombok.Data import was unused, since the accessors below are hand-written fluent setters)
 */
public class KafkaConfig {
private String servers;
private String topic;
private String groupId;
private Integer sendSize;
private Integer messageArraySize;
public Integer getMessageArraySize() {
return messageArraySize;
}
public KafkaConfig setMessageArraySize(Integer messageArraySize) {
this.messageArraySize = messageArraySize;
return this;
}
public String getServers() {
return servers;
}
public KafkaConfig setServers(String servers) {
this.servers = servers;
return this;
}
public String getTopic() {
return topic;
}
public KafkaConfig setTopic(String topic) {
this.topic = topic;
return this;
}
public String getGroupId() {
return groupId;
}
public KafkaConfig setGroupId(String groupId) {
this.groupId = groupId;
return this;
}
public Integer getSendSize() {
return sendSize;
}
public KafkaConfig setSendSize(Integer sendSize) {
this.sendSize = sendSize;
return this;
}
}
#============================ Kafka configuration ===============================
servers: 192.168.33.11:9092
topic: message1
groupId: message120211015
# Size of each message sent to Kafka (MiB)
sendSize: 1
messageArraySize: 5