
Windows10 + Docker + Redis Cluster + Spring Boot + Spring Data Redis(Lettuce) + etc

Posted at 2021-04-05

Environment

Item          Version
OS            Windows 10
redis-server  6.2.1
redis-cli     6.2.1
Docker Hub    3.1.0
Spring Boot   2.4.4

Environment notes

The environment was built following this article:
Redis Cluster with Docker on Windows (WindowsでDockerを用いてRedisCluster)

Spring Project

Dockerfile

This is so the app can join the same Docker network as Redis and run there.

FROM adoptopenjdk/openjdk11:ubi
RUN mkdir /opt/app
COPY target/demo-0.0.1-SNAPSHOT.jar /opt/app/redis-cluster-spring.jar
CMD ["java","-jar", "/opt/app/redis-cluster-spring.jar"]

pom.xml

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <parent>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-parent</artifactId>
        <version>2.4.4</version>
        <relativePath/> <!-- lookup parent from repository -->
    </parent>
    <groupId>com.example</groupId>
    <artifactId>demo</artifactId>
    <version>0.0.1-SNAPSHOT</version>
    <name>demo</name>
    <description>Demo project for Spring Boot</description>
    <properties>
        <java.version>11</java.version>
    </properties>
    <dependencies>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-data-redis</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-security</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-thymeleaf</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-web</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework.session</groupId>
            <artifactId>spring-session-data-redis</artifactId>
        </dependency>
        <dependency>
            <groupId>org.thymeleaf.extras</groupId>
            <artifactId>thymeleaf-extras-springsecurity5</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-actuator</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-devtools</artifactId>
            <scope>runtime</scope>
            <optional>true</optional>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-configuration-processor</artifactId>
            <optional>true</optional>
        </dependency>
        <dependency>
            <groupId>org.projectlombok</groupId>
            <artifactId>lombok</artifactId>
            <optional>true</optional>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-test</artifactId>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>org.springframework.security</groupId>
            <artifactId>spring-security-test</artifactId>
            <scope>test</scope>
        </dependency>
    </dependencies>

    <build>
        <plugins>
            <plugin>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-maven-plugin</artifactId>
                <version>2.4.4</version>
                <configuration>
                    <mainClass>com.example.demo.DemoApplication</mainClass>
                </configuration>
            </plugin>
        </plugins>
    </build>

</project>

application.yml

spring:
  session:
    store-type: redis
  redis:
    cluster:
      nodes:
#        - 172.18.0.2:7000
        - 172.18.0.3:7001
#        - 172.18.0.4:7002
        - 172.18.0.5:7003
        - 172.18.0.6:7004
#        - 172.18.0.7:7005
#        - 172.18.0.9:7002
      max-redirects: 5
    lettuce:
      cluster:
        refresh:
          adaptive: true
          period: 15
server:
  port: 8080
management:
  health:
    redis:
      enabled: false
logging:
  level.io.lettuce.core.cluster: trace

spring.session.store-type: redis

Switches the session store to Redis; a quick check is shown below.
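If it is working, Spring Session's keys should show up in the cluster. A way to check from redis-cli (the port is an assumption; -c follows cluster redirects, and --scan only walks the node you connect to):

PS C:\Users\Manager> redis-cli -c -p 7001 --scan --pattern "spring:session:*"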

spring.redis.cluster.nodes:

The three master nodes are enabled here.
However, a master swaps roles with a slave on failover, so all nodes are written out and the currently unused ones are just commented out; which nodes are masters can be checked as below.
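To see the current topology, ask any reachable node (the port is an assumption):

PS C:\Users\Manager> redis-cli -c -p 7001 cluster nodes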

spring.redis.lettuce.cluster.refresh.*

Read the following and judge for yourself:
https://lettuce.io/core/release/reference/index.html#clientoptions.cluster-specific-options
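For reference, a programmatic equivalent of these properties, as a sketch assuming you define the connection factory yourself instead of relying on auto-configuration (the node list and max-redirects come from application.yml above; reading `period: 15` as 15 seconds is my assumption):

package com.example.demo.config; // hypothetical class

import java.time.Duration;
import java.util.List;

import io.lettuce.core.cluster.ClusterClientOptions;
import io.lettuce.core.cluster.ClusterTopologyRefreshOptions;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.redis.connection.RedisClusterConfiguration;
import org.springframework.data.redis.connection.lettuce.LettuceClientConfiguration;
import org.springframework.data.redis.connection.lettuce.LettuceConnectionFactory;

@Configuration
public class RedisClusterConfig {

    @Bean
    public LettuceConnectionFactory redisConnectionFactory() {
        RedisClusterConfiguration cluster = new RedisClusterConfiguration(
                List.of("172.18.0.3:7001", "172.18.0.5:7003", "172.18.0.6:7004"));
        cluster.setMaxRedirects(5);

        // Adaptive + periodic topology refresh, mirroring
        // spring.redis.lettuce.cluster.refresh.* above.
        ClusterTopologyRefreshOptions refresh = ClusterTopologyRefreshOptions.builder()
                .enableAllAdaptiveRefreshTriggers()
                .enablePeriodicRefresh(Duration.ofSeconds(15))
                .build();

        LettuceClientConfiguration clientConfig = LettuceClientConfiguration.builder()
                .clientOptions(ClusterClientOptions.builder()
                        .topologyRefreshOptions(refresh)
                        .build())
                .build();

        return new LettuceConnectionFactory(cluster, clientConfig);
    }
}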

management.health.redis.enabled: false

If Redis is included in the actuator health checks, health may not be returned while a failover is in progress.
As far as I could confirm locally, there were cases where a 500 was returned momentarily regardless of whether this was set to true or false.

Note: the log below (collapsed into an accordion in the original article).
2021-04-05 10:01:03.022 ERROR 1 --- [nio-8080-exec-3] o.a.c.c.C.[Tomcat].[localhost]           : Exception Processing ErrorPage[errorCode=0, location=/error]

org.springframework.data.redis.RedisSystemException: Error in execution; nested exception is io.lettuce.core.RedisCommandExecutionException: CLUSTERDOWN The cluster is down
        at org.springframework.data.redis.connection.lettuce.LettuceExceptionConverter.convert(LettuceExceptionConverter.java:54) ~[spring-data-redis-2.4.6.jar!/:2.4.6]
        at org.springframework.data.redis.connection.lettuce.LettuceExceptionConverter.convert(LettuceExceptionConverter.java:52) ~[spring-data-redis-2.4.6.jar!/:2.4.6]
        at org.springframework.data.redis.connection.lettuce.LettuceExceptionConverter.convert(LettuceExceptionConverter.java:41) ~[spring-data-redis-2.4.6.jar!/:2.4.6]
        at org.springframework.data.redis.PassThroughExceptionTranslationStrategy.translate(PassThroughExceptionTranslationStrategy.java:44) ~[spring-data-redis-2.4.6.jar!/:2.4.6]
        at org.springframework.data.redis.FallbackExceptionTranslationStrategy.translate(FallbackExceptionTranslationStrategy.java:42) ~[spring-data-redis-2.4.6.jar!/:2.4.6]
        at org.springframework.data.redis.connection.lettuce.LettuceConnection.convertLettuceAccessException(LettuceConnection.java:274) ~[spring-data-redis-2.4.6.jar!/:2.4.6]
        at org.springframework.data.redis.connection.lettuce.LettuceHashCommands.convertLettuceAccessException(LettuceHashCommands.java:472) ~[spring-data-redis-2.4.6.jar!/:2.4.6]
        at org.springframework.data.redis.connection.lettuce.LettuceHashCommands.hGetAll(LettuceHashCommands.java:197) ~[spring-data-redis-2.4.6.jar!/:2.4.6]
        at org.springframework.data.redis.connection.DefaultedRedisConnection.hGetAll(DefaultedRedisConnection.java:1145) ~[spring-data-redis-2.4.6.jar!/:2.4.6]
        at org.springframework.data.redis.core.DefaultHashOperations.lambda$entries$13(DefaultHashOperations.java:245) ~[spring-data-redis-2.4.6.jar!/:2.4.6]
        at org.springframework.data.redis.core.RedisTemplate.execute(RedisTemplate.java:222) ~[spring-data-redis-2.4.6.jar!/:2.4.6]
        at org.springframework.data.redis.core.RedisTemplate.execute(RedisTemplate.java:189) ~[spring-data-redis-2.4.6.jar!/:2.4.6]
        at org.springframework.data.redis.core.AbstractOperations.execute(AbstractOperations.java:96) ~[spring-data-redis-2.4.6.jar!/:2.4.6]
        at org.springframework.data.redis.core.DefaultHashOperations.entries(DefaultHashOperations.java:245) ~[spring-data-redis-2.4.6.jar!/:2.4.6]
        at org.springframework.data.redis.core.DefaultBoundHashOperations.entries(DefaultBoundHashOperations.java:183) ~[spring-data-redis-2.4.6.jar!/:2.4.6]
        at org.springframework.session.data.redis.RedisIndexedSessionRepository.getSession(RedisIndexedSessionRepository.java:440) ~[spring-session-data-redis-2.4.2.jar!/:2.4.2]
        at org.springframework.session.data.redis.RedisIndexedSessionRepository.findById(RedisIndexedSessionRepository.java:412) ~[spring-session-data-redis-2.4.2.jar!/:2.4.2]
        at org.springframework.session.data.redis.RedisIndexedSessionRepository.findById(RedisIndexedSessionRepository.java:249) ~[spring-session-data-redis-2.4.2.jar!/:2.4.2]
        at org.springframework.session.web.http.SessionRepositoryFilter$SessionRepositoryRequestWrapper.getRequestedSession(SessionRepositoryFilter.java:351) ~[spring-session-core-2.4.2.jar!/:2.4.2]
        at org.springframework.session.web.http.SessionRepositoryFilter$SessionRepositoryRequestWrapper.getSession(SessionRepositoryFilter.java:289) ~[spring-session-core-2.4.2.jar!/:2.4.2]
        at org.springframework.session.web.http.SessionRepositoryFilter$SessionRepositoryRequestWrapper.getSession(SessionRepositoryFilter.java:192) ~[spring-session-core-2.4.2.jar!/:2.4.2]
        at javax.servlet.http.HttpServletRequestWrapper.getSession(HttpServletRequestWrapper.java:244) ~[tomcat-embed-core-9.0.44.jar!/:4.0.FR]
        at org.springframework.security.web.savedrequest.HttpSessionRequestCache.getRequest(HttpSessionRequestCache.java:85) ~[spring-security-web-5.4.5.jar!/:5.4.5]
        at org.springframework.security.web.savedrequest.HttpSessionRequestCache.getMatchingRequest(HttpSessionRequestCache.java:100) ~[spring-security-web-5.4.5.jar!/:5.4.5]
        at org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:61) ~[spring-security-web-5.4.5.jar!/:5.4.5]
        at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:336) ~[spring-security-web-5.4.5.jar!/:5.4.5]
        at org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:103) ~[spring-security-web-5.4.5.jar!/:5.4.5]
        at org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:89) ~[spring-security-web-5.4.5.jar!/:5.4.5]
        at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:336) ~[spring-security-web-5.4.5.jar!/:5.4.5]
        at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:103) ~[spring-web-5.3.5.jar!/:5.3.5]
        at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:336) ~[spring-security-web-5.4.5.jar!/:5.4.5]
        at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:103) ~[spring-web-5.3.5.jar!/:5.3.5]
        at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:336) ~[spring-security-web-5.4.5.jar!/:5.4.5]
        at org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:87) ~[spring-security-web-5.4.5.jar!/:5.4.5]
        at org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:80) ~[spring-security-web-5.4.5.jar!/:5.4.5]
        at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:336) ~[spring-security-web-5.4.5.jar!/:5.4.5]
        at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:103) ~[spring-web-5.3.5.jar!/:5.3.5]
        at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:336) ~[spring-security-web-5.4.5.jar!/:5.4.5]
        at org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:211) ~[spring-security-web-5.4.5.jar!/:5.4.5]
        at org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:183) ~[spring-security-web-5.4.5.jar!/:5.4.5]
        at org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:358) ~[spring-web-5.3.5.jar!/:5.3.5]
        at org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:271) ~[spring-web-5.3.5.jar!/:5.3.5]
        at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:189) ~[tomcat-embed-core-9.0.44.jar!/:9.0.44]
        at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:162) ~[tomcat-embed-core-9.0.44.jar!/:9.0.44]
        at org.springframework.web.filter.RequestContextFilter.doFilterInternal(RequestContextFilter.java:100) ~[spring-web-5.3.5.jar!/:5.3.5]
        at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:119) ~[spring-web-5.3.5.jar!/:5.3.5]
        at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:189) ~[tomcat-embed-core-9.0.44.jar!/:9.0.44]
        at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:162) ~[tomcat-embed-core-9.0.44.jar!/:9.0.44]
        at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:103) ~[spring-web-5.3.5.jar!/:5.3.5]
        at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:189) ~[tomcat-embed-core-9.0.44.jar!/:9.0.44]
        at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:162) ~[tomcat-embed-core-9.0.44.jar!/:9.0.44]
        at org.springframework.session.web.http.SessionRepositoryFilter.doFilterInternal(SessionRepositoryFilter.java:141) ~[spring-session-core-2.4.2.jar!/:2.4.2]
        at org.springframework.session.web.http.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:82) ~[spring-session-core-2.4.2.jar!/:2.4.2]
        at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:189) ~[tomcat-embed-core-9.0.44.jar!/:9.0.44]
        at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:162) ~[tomcat-embed-core-9.0.44.jar!/:9.0.44]
        at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:103) ~[spring-web-5.3.5.jar!/:5.3.5]
        at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:189) ~[tomcat-embed-core-9.0.44.jar!/:9.0.44]
        at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:162) ~[tomcat-embed-core-9.0.44.jar!/:9.0.44]
        at org.apache.catalina.core.ApplicationDispatcher.invoke(ApplicationDispatcher.java:710) ~[tomcat-embed-core-9.0.44.jar!/:9.0.44]
        at org.apache.catalina.core.ApplicationDispatcher.processRequest(ApplicationDispatcher.java:459) ~[tomcat-embed-core-9.0.44.jar!/:9.0.44]
        at org.apache.catalina.core.ApplicationDispatcher.doForward(ApplicationDispatcher.java:384) ~[tomcat-embed-core-9.0.44.jar!/:9.0.44]
        at org.apache.catalina.core.ApplicationDispatcher.forward(ApplicationDispatcher.java:312) ~[tomcat-embed-core-9.0.44.jar!/:9.0.44]
        at org.apache.catalina.core.StandardHostValve.custom(StandardHostValve.java:398) ~[tomcat-embed-core-9.0.44.jar!/:9.0.44]
        at org.apache.catalina.core.StandardHostValve.status(StandardHostValve.java:257) ~[tomcat-embed-core-9.0.44.jar!/:9.0.44]
        at org.apache.catalina.core.StandardHostValve.throwable(StandardHostValve.java:352) ~[tomcat-embed-core-9.0.44.jar!/:9.0.44]
        at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:177) ~[tomcat-embed-core-9.0.44.jar!/:9.0.44]
        at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:92) ~[tomcat-embed-core-9.0.44.jar!/:9.0.44]
        at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:78) ~[tomcat-embed-core-9.0.44.jar!/:9.0.44]
        at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:357) ~[tomcat-embed-core-9.0.44.jar!/:9.0.44]
        at org.apache.coyote.http11.Http11Processor.service(Http11Processor.java:374) ~[tomcat-embed-core-9.0.44.jar!/:9.0.44]
        at org.apache.coyote.AbstractProcessorLight.process(AbstractProcessorLight.java:65) ~[tomcat-embed-core-9.0.44.jar!/:9.0.44]
        at org.apache.coyote.AbstractProtocol$ConnectionHandler.process(AbstractProtocol.java:893) ~[tomcat-embed-core-9.0.44.jar!/:9.0.44]
        at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1707) ~[tomcat-embed-core-9.0.44.jar!/:9.0.44]
        at org.apache.tomcat.util.net.SocketProcessorBase.run(SocketProcessorBase.java:49) ~[tomcat-embed-core-9.0.44.jar!/:9.0.44]
        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) ~[na:na]
        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) ~[na:na]
        at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61) ~[tomcat-embed-core-9.0.44.jar!/:9.0.44]
        at java.base/java.lang.Thread.run(Thread.java:834) ~[na:na]
Caused by: io.lettuce.core.RedisCommandExecutionException: CLUSTERDOWN The cluster is down
        at io.lettuce.core.internal.ExceptionFactory.createExecutionException(ExceptionFactory.java:137) ~[lettuce-core-6.0.3.RELEASE.jar!/:6.0.3.RELEASE]
        at io.lettuce.core.internal.ExceptionFactory.createExecutionException(ExceptionFactory.java:110) ~[lettuce-core-6.0.3.RELEASE.jar!/:6.0.3.RELEASE]
        at io.lettuce.core.protocol.AsyncCommand.completeResult(AsyncCommand.java:120) ~[lettuce-core-6.0.3.RELEASE.jar!/:6.0.3.RELEASE]
        at io.lettuce.core.protocol.AsyncCommand.complete(AsyncCommand.java:111) ~[lettuce-core-6.0.3.RELEASE.jar!/:6.0.3.RELEASE]
        at io.lettuce.core.protocol.CommandWrapper.complete(CommandWrapper.java:63) ~[lettuce-core-6.0.3.RELEASE.jar!/:6.0.3.RELEASE]
        at io.lettuce.core.cluster.ClusterCommand.complete(ClusterCommand.java:65) ~[lettuce-core-6.0.3.RELEASE.jar!/:6.0.3.RELEASE]
        at io.lettuce.core.protocol.CommandWrapper.complete(CommandWrapper.java:63) ~[lettuce-core-6.0.3.RELEASE.jar!/:6.0.3.RELEASE]
        at io.lettuce.core.protocol.CommandHandler.complete(CommandHandler.java:737) ~[lettuce-core-6.0.3.RELEASE.jar!/:6.0.3.RELEASE]
        at io.lettuce.core.protocol.CommandHandler.decode(CommandHandler.java:672) ~[lettuce-core-6.0.3.RELEASE.jar!/:6.0.3.RELEASE]
        at io.lettuce.core.protocol.CommandHandler.channelRead(CommandHandler.java:589) ~[lettuce-core-6.0.3.RELEASE.jar!/:6.0.3.RELEASE]
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379) ~[netty-transport-4.1.60.Final.jar!/:4.1.60.Final]
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365) ~[netty-transport-4.1.60.Final.jar!/:4.1.60.Final]
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357) ~[netty-transport-4.1.60.Final.jar!/:4.1.60.Final]
        at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410) ~[netty-transport-4.1.60.Final.jar!/:4.1.60.Final]
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379) ~[netty-transport-4.1.60.Final.jar!/:4.1.60.Final]
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365) ~[netty-transport-4.1.60.Final.jar!/:4.1.60.Final]
        at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919) ~[netty-transport-4.1.60.Final.jar!/:4.1.60.Final]
        at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:166) ~[netty-transport-4.1.60.Final.jar!/:4.1.60.Final]
        at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:719) ~[netty-transport-4.1.60.Final.jar!/:4.1.60.Final]
        at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:655) ~[netty-transport-4.1.60.Final.jar!/:4.1.60.Final]
        at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:581) ~[netty-transport-4.1.60.Final.jar!/:4.1.60.Final]
        at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:493) ~[netty-transport-4.1.60.Final.jar!/:4.1.60.Final]
        at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:989) ~[netty-common-4.1.60.Final.jar!/:4.1.60.Final]
        at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74) ~[netty-common-4.1.60.Final.jar!/:4.1.60.Final]
        at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30) ~[netty-common-4.1.60.Final.jar!/:4.1.60.Final]
        ... 1 common frames omitted

What follows is just a repetition of the output below.
The client loops, retrying the connection to the node it can no longer reach; this appears to be normal behavior.
Once the node is restarted, the output stops.

2021-04-05 10:01:07.002  WARN 1 --- [ioEventLoop-4-3] i.l.c.c.t.DefaultClusterTopologyRefresh  : Unable to connect to [172.18.0.2:7000]: connection timed out: /172.18.0.2:7000
2021-04-05 10:01:07.102  INFO 1 --- [xecutorLoop-1-1] i.l.core.protocol.ConnectionWatchdog     : Reconnecting, last destination was 172.18.0.2:7000
2021-04-05 10:01:17.039  WARN 1 --- [ioEventLoop-4-3] i.l.c.c.t.DefaultClusterTopologyRefresh  : Unable to connect to [172.18.0.2:7000]: connection timed out: /172.18.0.2:7000
2021-04-05 10:01:17.104  WARN 1 --- [ioEventLoop-4-2] i.l.core.protocol.ConnectionWatchdog     : Cannot reconnect to [172.18.0.2:7000]: connection timed out: /172.18.0.2:7000
2021-04-05 10:01:17.202  INFO 1 --- [xecutorLoop-1-1] i.l.core.protocol.ConnectionWatchdog     : Reconnecting, last destination was 172.18.0.2:7000
2021-04-05 10:01:27.057  WARN 1 --- [ioEventLoop-4-3] i.l.c.c.t.DefaultClusterTopologyRefresh  : Unable to connect to [172.18.0.2:7000]: connection timed out: /172.18.0.2:7000
2021-04-05 10:01:27.204  WARN 1 --- [ioEventLoop-4-2] i.l.core.protocol.ConnectionWatchdog     : Cannot reconnect to [172.18.0.2:7000]: connection timed out: /172.18.0.2:7000
2021-04-05 10:01:27.302  INFO 1 --- [xecutorLoop-1-1] i.l.core.protocol.ConnectionWatchdog     : Reconnecting, last destination was 172.18.0.2:7000
2021-04-05 10:01:37.078  WARN 1 --- [ioEventLoop-4-3] i.l.c.c.t.DefaultClusterTopologyRefresh  : Unable to connect to [172.18.0.2:7000]: connection timed out: /172.18.0.2:7000
2021-04-05 10:01:37.312  WARN 1 --- [ioEventLoop-4-2] i.l.core.protocol.ConnectionWatchdog     : Cannot reconnect to [172.18.0.2:7000]: connection timed out: /172.18.0.2:7000
2021-04-05 10:01:37.502  INFO 1 --- [xecutorLoop-1-2] i.l.core.protocol.ConnectionWatchdog     : Reconnecting, last destination was 172.18.0.2:7000
2021-04-05 10:01:41.686  WARN 1 --- [ioEventLoop-4-3] i.l.c.c.t.DefaultClusterTopologyRefresh  : Unable to connect to [172.18.0.2:7000]: No route to host: /172.18.0.2:7000
2021-04-05 10:01:41.688  WARN 1 --- [ioEventLoop-4-2] i.l.core.protocol.ConnectionWatchdog     : Cannot reconnect to [172.18.0.2:7000]: No route to host: /172.18.0.2:7000

io.netty.channel.AbstractChannel$AnnotatedNoRouteToHostException: No route to host: /172.18.0.2:7000
Caused by: java.net.NoRouteToHostException: No route to host
        at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[na:na]
        at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:779) ~[na:na]
        at io.netty.channel.socket.nio.NioSocketChannel.doFinishConnect(NioSocketChannel.java:330) ~[netty-transport-4.1.60.Final.jar!/:4.1.60.Final]
        at io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.finishConnect(AbstractNioChannel.java:334) ~[netty-transport-4.1.60.Final.jar!/:4.1.60.Final]
        at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:707) ~[netty-transport-4.1.60.Final.jar!/:4.1.60.Final]
        at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:655) ~[netty-transport-4.1.60.Final.jar!/:4.1.60.Final]
        at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:581) ~[netty-transport-4.1.60.Final.jar!/:4.1.60.Final]
        at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:493) ~[netty-transport-4.1.60.Final.jar!/:4.1.60.Final]
        at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:989) ~[netty-common-4.1.60.Final.jar!/:4.1.60.Final]
        at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74) ~[netty-common-4.1.60.Final.jar!/:4.1.60.Final]
        at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30) ~[netty-common-4.1.60.Final.jar!/:4.1.60.Final]
        at java.base/java.lang.Thread.run(Thread.java:834) ~[na:na]

logging.level.io.lettuce.core.cluster: trace

Be warned: this produces an enormous amount of output.

Implementation

Basically, there is no need to use any of the @Enable~ annotations.
Adding the Spring Boot dependencies and enabling the appropriate properties is enough, as the sketch below shows.
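The entry point class is not shown in the original; given the mainClass declared in pom.xml, it is presumably just the standard generated class:

package com.example.demo;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

// No @EnableRedisHttpSession / @EnableRedisRepositories here;
// the starters plus the properties above take care of it.
@SpringBootApplication
public class DemoApplication {

    public static void main(String[] args) {
        SpringApplication.run(DemoApplication.class, args);
    }
}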

Controller

package com.example.demo.controller;

import com.example.demo.context.UserSession;
import com.example.demo.entity.Message;
import com.example.demo.repository.MessageRedisRepository;
import lombok.RequiredArgsConstructor;
import lombok.extern.slf4j.Slf4j;
import org.springframework.stereotype.Controller;
import org.springframework.ui.Model;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestParam;

import java.util.List;
import java.util.Optional;
import java.util.stream.Collectors;
import java.util.stream.StreamSupport;

@Controller
@Slf4j
@RequiredArgsConstructor
public class TestController {

    private final UserSession userSession;
    private final MessageRedisRepository messageRedisRepository;

    @GetMapping({"/index", "/get", "/"})
    public String getIndex(Model model) {
        model.addAttribute("test", "initial value");
        return "Index";
    }

    @PostMapping("/index")
    public String postIndex(Model model, @RequestParam("test") String text) {

        log.info("before value:{}", text);
        Long id = Math.round(Math.random() * 1000000);
        Message message = new Message(id, text);
        messageRedisRepository.save(message);
        log.info("id :{}", id);

        userSession.setReceiveMessage(text);
        userSession.getMessages().add(text);

        model.addAttribute("messages", List.of("id:" + id + ", message:" + text));

        return "Index";
    }

    @PostMapping("/get")
    public String postGet(Model model, @RequestParam("id") Long id) {
        Optional<Message> message = messageRedisRepository.findById(id);
        message.ifPresent(value -> model.addAttribute("messages", List.of(
                id + ":" + value.getMessage()
        )));
        return "Index";
    }

    @GetMapping("/all")
    public String getAll(Model model) {
        Iterable<Message> messages = messageRedisRepository.findAll();
        // Stream the Iterable directly instead of wrapping it in Stream.of + flatMap
        List<String> list = StreamSupport.stream(messages.spliterator(), false)
                .map(e -> e.getId() + ":" + e.getMessage())
                .collect(Collectors.toList());
        model.addAttribute("messages", list);
        return "Index";
    }
}
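The UserSession class injected above is not shown in the original article. A minimal sketch consistent with how the controller uses it (one instance per HTTP session, serializable so Spring Session can store it in Redis) might look like this:

package com.example.demo.context;

import java.io.Serializable;
import java.util.ArrayList;
import java.util.List;

import lombok.Data;
import org.springframework.stereotype.Component;
import org.springframework.web.context.annotation.SessionScope;

// Hypothetical reconstruction; @SessionScope proxies the bean so it can be
// injected into the singleton controller.
@Data
@Component
@SessionScope
public class UserSession implements Serializable {
    private String receiveMessage;
    private final List<String> messages = new ArrayList<>();
}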

Entity for Redis

package com.example.demo.entity;

import lombok.AllArgsConstructor;
import lombok.Data;
import lombok.NoArgsConstructor;
import lombok.ToString;
import org.springframework.data.annotation.Id;
import org.springframework.data.redis.core.RedisHash;

import java.io.Serializable;

@Data
@AllArgsConstructor
@NoArgsConstructor
@ToString
@RedisHash("message")
public class Message implements Serializable {
    @Id // explicit id marker; Spring Data would also pick up a field named "id" by convention
    private Long id;
    private String message;
}

Redis Repository

package com.example.demo.repository;


import com.example.demo.entity.Message;
import org.springframework.data.repository.CrudRepository;
import org.springframework.stereotype.Repository;

@Repository
public interface MessageRedisRepository extends CrudRepository<Message, Long> {
}
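With @RedisHash("message"), Spring Data Redis stores each entity as a hash under message:&lt;id&gt; and keeps the set of all ids under message, so saved data can be inspected from redis-cli (the port is an assumption; the id comes from the log further down):

PS C:\Users\Manager> redis-cli -c -p 7001 smembers message
PS C:\Users\Manager> redis-cli -c -p 7001 hgetall message:621606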

src/main/resources/templates/Index.html

(The controller returns "Index", so the file name must match that case inside the jar.)

<!DOCTYPE html>
<html lang="ja" xmlns:th="http://www.thymeleaf.org">
<head>
    <meta charset="UTF-8">
    <title>Test</title>
</head>
<body>
<pre>
    Please Input Test text
</pre>
<form th:action="@{/index}" method="post">
    <input type="text" name="test" th:value="${test}"/>
    <input type="submit" value="send"/>
</form>
<form th:action="@{/get}" method="post">
    <input type="number" name="id" th:value="${id}"/>
    <input type="submit" value="get"/>
</form>
<form th:action="@{/all}" method="get">
    <input type="submit" value="all"/>
</form>
<div th:each="message:${messages}">
    <div th:text="${message}"></div>
</div>
</body>
</html>

Test (normal case)

PS C:\Users\Manager> docker run --net=redis-network -p 8080:8080 --name=spring-redis-cluster spring-redis-cluster

  .   ____          _            __ _ _
 /\\ / ___'_ __ _ _(_)_ __  __ _ \ \ \ \
( ( )\___ | '_ | '_| | '_ \/ _` | \ \ \ \
 \\/  ___)| |_)| | | | | || (_| |  ) ) ) )
  '  |____| .__|_| |_|_| |_\__, | / / / /
 =========|_|==============|___/=/_/_/_/
 :: Spring Boot ::                (v2.4.4)

2021-04-04 15:38:11.102  INFO 1 --- [           main] com.example.demo.DemoApplication         : Starting DemoApplication v0.0.1-SNAPSHOT using Java 11.0.10 on 6f70a8cbae1d with PID 1 (/opt/app/redis-cluster-spring.jar started by root in /)
2021-04-04 15:38:11.113  INFO 1 --- [           main] com.example.demo.DemoApplication         : No active profile set, falling back to default profiles: default
2021-04-04 15:38:14.086  INFO 1 --- [           main] .s.d.r.c.RepositoryConfigurationDelegate : Multiple Spring Data modules found, entering strict repository configuration mode!
2021-04-04 15:38:14.096  INFO 1 --- [           main] .s.d.r.c.RepositoryConfigurationDelegate : Bootstrapping Spring Data Redis repositories in DEFAULT mode.
2021-04-04 15:38:14.549  INFO 1 --- [           main] .s.d.r.c.RepositoryConfigurationDelegate : Finished Spring Data repository scanning in 415 ms. Found 1 Redis repository interfaces.
2021-04-04 15:38:16.177  INFO 1 --- [           main] o.s.b.w.embedded.tomcat.TomcatWebServer  : Tomcat initialized with port(s): 8080 (http)
2021-04-04 15:38:16.215  INFO 1 --- [           main] o.apache.catalina.core.StandardService   : Starting service [Tomcat]
2021-04-04 15:38:16.216  INFO 1 --- [           main] org.apache.catalina.core.StandardEngine  : Starting Servlet engine: [Apache Tomcat/9.0.44]
2021-04-04 15:38:16.361  INFO 1 --- [           main] o.a.c.c.C.[Tomcat].[localhost].[/]       : Initializing Spring embedded WebApplicationContext
2021-04-04 15:38:16.362  INFO 1 --- [           main] w.s.c.ServletWebServerApplicationContext : Root WebApplicationContext: initialization completed in 4995 ms
2021-04-04 15:38:17.175  INFO 1 --- [           main] o.s.s.concurrent.ThreadPoolTaskExecutor  : Initializing ExecutorService
2021-04-04 15:38:18.520  INFO 1 --- [           main] o.s.s.concurrent.ThreadPoolTaskExecutor  : Initializing ExecutorService 'applicationTaskExecutor'
2021-04-04 15:38:20.929  INFO 1 --- [           main] .s.s.UserDetailsServiceAutoConfiguration :

Using generated security password: ef3abd4e-70ff-4626-b718-57da968d37b4

2021-04-04 15:38:21.317  INFO 1 --- [           main] o.s.s.web.DefaultSecurityFilterChain     : Will secure any request with [org.springframework.security.web.context.request.async.WebAsyncManagerIntegrationFilter@2f5b8250, org.springframework.security.web.context.SecurityContextPersistenceFilter@67e13bd0, org.springframework.security.web.header.HeaderWriterFilter@77a074b4, org.springframework.security.web.csrf.CsrfFilter@748d2277, org.springframework.security.web.authentication.logout.LogoutFilter@4f89331f, org.springframework.security.web.savedrequest.RequestCacheAwareFilter@2cae9b8, org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter@4228bf58, org.springframework.security.web.authentication.AnonymousAuthenticationFilter@4821aa9f, org.springframework.security.web.session.SessionManagementFilter@1dc3502b, org.springframework.security.web.access.ExceptionTranslationFilter@7b948f3e, org.springframework.security.web.access.intercept.FilterSecurityInterceptor@6e4f263e]
2021-04-04 15:38:21.413  INFO 1 --- [           main] o.s.b.a.e.web.EndpointLinksResolver      : Exposing 2 endpoint(s) beneath base path '/actuator'
2021-04-04 15:38:21.693  INFO 1 --- [           main] o.s.b.w.embedded.tomcat.TomcatWebServer  : Tomcat started on port(s): 8080 (http) with context path ''
2021-04-04 15:38:22.852  INFO 1 --- [           main] s.a.ScheduledAnnotationBeanPostProcessor : No TaskScheduler/ScheduledExecutorService bean found for scheduled processing
2021-04-04 15:38:22.917  INFO 1 --- [           main] com.example.demo.DemoApplication         : Started DemoApplication in 14.056 seconds (JVM running for 16.44)

http://localhost:8080/

(screenshot)

send

(screenshot)

get

(screenshot)

send

(screenshots)

log

2021-04-04 16:04:19.448  INFO 1 --- [nio-8080-exec-4] c.e.demo.controller.TestController       : before value:initial value
2021-04-04 16:04:19.656  INFO 1 --- [nio-8080-exec-4] c.e.demo.controller.TestController       : id :621606
2021-04-04 16:05:19.171  INFO 1 --- [nio-8080-exec-6] c.e.demo.controller.TestController       : before value:aaaeeee
2021-04-04 16:05:19.214  INFO 1 --- [nio-8080-exec-6] c.e.demo.controller.TestController       : id :646105

Test (failure case)

Trigger a failover. For how to do this, see Redis Cluster with Docker on Windows (WindowsでDockerを用いてRedisCluster); a rough shortcut is sketched below.
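A crude way to force it is simply to stop one of the master node containers (the container name here is hypothetical; 7003 matches the node that drops out in the log below):

PS C:\Users\Manager> docker stop redis-node-7003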

log

2021-04-05 03:33:09.983  INFO 1 --- [ioEventLoop-4-4] i.l.core.protocol.ReconnectionHandler    : Reconnected to 172.18.0.6:7004
2021-04-05 03:33:09.996  WARN 1 --- [ioEventLoop-4-4] i.l.c.c.t.DefaultClusterTopologyRefresh  : Unable to connect to [172.18.0.5:7003]: Connection refused: /172.18.0.5:7003
2021-04-05 03:33:10.047  WARN 1 --- [ioEventLoop-4-1] i.l.c.c.t.DefaultClusterTopologyRefresh  : Unable to connect to [172.18.0.5:7003]: Connection refused: /172.18.0.5:7003
2021-04-05 03:33:10.092  WARN 1 --- [ioEventLoop-4-4] i.l.c.c.t.DefaultClusterTopologyRefresh  : Unable to connect to [172.18.0.5:7003]: Connection refused: /172.18.0.5:7003
2021-04-05 03:33:10.134  WARN 1 --- [ioEventLoop-4-1] i.l.c.c.t.DefaultClusterTopologyRefresh  : Unable to connect to [172.18.0.5:7003]: Connection refused: /172.18.0.5:7003
2021-04-05 03:33:10.166  WARN 1 --- [ioEventLoop-4-4] i.l.c.c.t.DefaultClusterTopologyRefresh  : Unable to connect to [172.18.0.5:7003]: Connection refused: /172.18.0.5:7003
2021-04-05 03:33:10.183  WARN 1 --- [ioEventLoop-4-3] i.l.c.c.t.DefaultClusterTopologyRefresh  : Unable to connect to [172.18.0.5:7003]: Connection refused: /172.18.0.5:7003
2021-04-05 03:33:10.262  WARN 1 --- [ioEventLoop-4-4] i.l.c.c.t.DefaultClusterTopologyRefresh  : Unable to connect to [172.18.0.5:7003]: Connection refused: /172.18.0.5:7003
2021-04-05 03:33:10.285  WARN 1 --- [ioEventLoop-4-3] i.l.c.c.t.DefaultClusterTopologyRefresh  : Unable to connect to [172.18.0.5:7003]: Connection refused: /172.18.0.5:7003
2021-04-05 03:33:10.314  WARN 1 --- [ioEventLoop-4-2] i.l.c.c.t.DefaultClusterTopologyRefresh  : Unable to connect to [172.18.0.5:7003]: Connection refused: /172.18.0.5:7003
2021-04-05 03:33:10.347  WARN 1 --- [ioEventLoop-4-3] i.l.c.c.t.DefaultClusterTopologyRefresh  : Unable to connect to [172.18.0.5:7003]: Connection refused: /172.18.0.5:7003
2021-04-05 03:33:10.392  WARN 1 --- [ioEventLoop-4-2] i.l.c.c.t.DefaultClusterTopologyRefresh  : Unable to connect to [172.18.0.5:7003]: Connection refused: /172.18.0.5:7003
2021-04-05 03:33:10.420  WARN 1 --- [ioEventLoop-4-1] i.l.c.c.t.DefaultClusterTopologyRefresh  : Unable to connect to [172.18.0.5:7003]: Connection refused: /172.18.0.5:7003
2021-04-05 03:33:10.434  WARN 1 --- [ioEventLoop-4-2] i.l.c.c.t.DefaultClusterTopologyRefresh  : Unable to connect to [172.18.0.5:7003]: Connection refused: /172.18.0.5:7003
2021-04-05 03:33:10.481  WARN 1 --- [ioEventLoop-4-1] i.l.c.c.t.DefaultClusterTopologyRefresh  : Unable to connect to [172.18.0.5:7003]: Connection refused: /172.18.0.5:7003
2021-04-05 03:33:10.498  WARN 1 --- [ioEventLoop-4-4] i.l.c.c.t.DefaultClusterTopologyRefresh  : Unable to connect to [172.18.0.5:7003]: Connection refused: /172.18.0.5:7003
2021-04-05 03:33:10.525  WARN 1 --- [ioEventLoop-4-3] i.l.c.c.t.DefaultClusterTopologyRefresh  : Unable to connect to [172.18.0.5:7003]: Connection refused: /172.18.0.5:7003
2021-04-05 03:33:10.554  WARN 1 --- [ioEventLoop-4-2] i.l.c.c.t.DefaultClusterTopologyRefresh  : Unable to connect to [172.18.0.5:7003]: Connection refused: /172.18.0.5:7003
2021-04-05 03:33:10.585  WARN 1 --- [ioEventLoop-4-1] i.l.c.c.t.DefaultClusterTopologyRefresh  : Unable to connect to [172.18.0.5:7003]: Connection refused: /172.18.0.5:7003
2021-04-05 03:33:10.600  WARN 1 --- [ioEventLoop-4-4] i.l.c.c.t.DefaultClusterTopologyRefresh  : Unable to connect to [172.18.0.5:7003]: Connection refused: /172.18.0.5:7003
2021-04-05 03:33:10.646  WARN 1 --- [ioEventLoop-4-1] i.l.c.c.t.DefaultClusterTopologyRefresh  : Unable to connect to [172.18.0.5:7003]: Connection refused: /172.18.0.5:7003
2021-04-05 03:33:10.665  WARN 1 --- [ioEventLoop-4-4] i.l.c.c.t.DefaultClusterTopologyRefresh  : Unable to connect to [172.18.0.5:7003]: Connection refused: /172.18.0.5:7003
2021-04-05 03:33:10.707  WARN 1 --- [ioEventLoop-4-3] i.l.c.c.t.DefaultClusterTopologyRefresh  : Unable to connect to [172.18.0.5:7003]: Connection refused: /172.18.0.5:7003
2021-04-05 03:33:10.734  WARN 1 --- [ioEventLoop-4-2] i.l.c.c.t.DefaultClusterTopologyRefresh  : Unable to connect to [172.18.0.5:7003]: Connection refused: /172.18.0.5:7003
2021-04-05 03:33:10.765  WARN 1 --- [ioEventLoop-4-1] i.l.c.c.t.DefaultClusterTopologyRefresh  : Unable to connect to [172.18.0.5:7003]: Connection refused: /172.18.0.5:7003
2021-04-05 03:33:10.788  WARN 1 --- [ioEventLoop-4-4] i.l.c.c.t.DefaultClusterTopologyRefresh  : Unable to connect to [172.18.0.5:7003]: Connection refused: /172.18.0.5:7003
2021-04-05 03:33:10.810  WARN 1 --- [ioEventLoop-4-3] i.l.c.c.t.DefaultClusterTopologyRefresh  : Unable to connect to [172.18.0.5:7003]: Connection refused: /172.18.0.5:7003
2021-04-05 03:33:10.840  WARN 1 --- [ioEventLoop-4-2] i.l.c.c.t.DefaultClusterTopologyRefresh  : Unable to connect to [172.18.0.5:7003]: Connection refused: /172.18.0.5:7003
2021-04-05 03:33:10.855  WARN 1 --- [ioEventLoop-4-1] i.l.c.c.t.DefaultClusterTopologyRefresh  : Unable to connect to [172.18.0.5:7003]: Connection refused: /172.18.0.5:7003
2021-04-05 03:33:10.884  WARN 1 --- [ioEventLoop-4-4] i.l.c.c.t.DefaultClusterTopologyRefresh  : Unable to connect to [172.18.0.5:7003]: Connection refused: /172.18.0.5:7003
2021-04-05 03:33:10.915  WARN 1 --- [ioEventLoop-4-3] i.l.c.c.t.DefaultClusterTopologyRefresh  : Unable to connect to [172.18.0.5:7003]: Connection refused: /172.18.0.5:7003
2021-04-05 03:33:10.932  WARN 1 --- [ioEventLoop-4-2] i.l.c.c.t.DefaultClusterTopologyRefresh  : Unable to connect to [172.18.0.5:7003]: Connection refused: /172.18.0.5:7003
2021-04-05 03:33:10.966  WARN 1 --- [ioEventLoop-4-3] i.l.c.c.t.DefaultClusterTopologyRefresh  : Unable to connect to [172.18.0.5:7003]: Connection refused: /172.18.0.5:7003
2021-04-05 03:33:10.991  WARN 1 --- [ioEventLoop-4-2] i.l.c.c.t.DefaultClusterTopologyRefresh  : Unable to connect to [172.18.0.5:7003]: Connection refused: /172.18.0.5:7003
2021-04-05 03:33:11.030  WARN 1 --- [ioEventLoop-4-1] i.l.c.c.t.DefaultClusterTopologyRefresh  : Unable to connect to [172.18.0.5:7003]: Connection refused: /172.18.0.5:7003
2021-04-05 03:33:11.059  WARN 1 --- [ioEventLoop-4-4] i.l.c.c.t.DefaultClusterTopologyRefresh  : Unable to connect to [172.18.0.5:7003]: Connection refused: /172.18.0.5:7003
2021-04-05 03:33:11.095  WARN 1 --- [ioEventLoop-4-3] i.l.c.c.t.DefaultClusterTopologyRefresh  : Unable to connect to [172.18.0.5:7003]: Connection refused: /172.18.0.5:7003
2021-04-05 03:33:11.114  WARN 1 --- [ioEventLoop-4-2] i.l.c.c.t.DefaultClusterTopologyRefresh  : Unable to connect to [172.18.0.5:7003]: Connection refused: /172.18.0.5:7003
2021-04-05 03:33:11.133  WARN 1 --- [ioEventLoop-4-1] i.l.c.c.t.DefaultClusterTopologyRefresh  : Unable to connect to [172.18.0.5:7003]: Connection refused: /172.18.0.5:7003
2021-04-05 03:33:11.154  WARN 1 --- [ioEventLoop-4-4] i.l.c.c.t.DefaultClusterTopologyRefresh  : Unable to connect to [172.18.0.5:7003]: Connection refused: /172.18.0.5:7003
2021-04-05 03:33:11.185  WARN 1 --- [ioEventLoop-4-3] i.l.c.c.t.DefaultClusterTopologyRefresh  : Unable to connect to [172.18.0.5:7003]: Connection refused: /172.18.0.5:7003
2021-04-05 03:33:11.216  WARN 1 --- [ioEventLoop-4-2] i.l.c.c.t.DefaultClusterTopologyRefresh  : Unable to connect to [172.18.0.5:7003]: Connection refused: /172.18.0.5:7003
2021-04-05 03:33:11.245  WARN 1 --- [ioEventLoop-4-1] i.l.c.c.t.DefaultClusterTopologyRefresh  : Unable to connect to [172.18.0.5:7003]: Connection refused: /172.18.0.5:7003
2021-04-05 03:33:11.265  WARN 1 --- [ioEventLoop-4-4] i.l.c.c.t.DefaultClusterTopologyRefresh  : Unable to connect to [172.18.0.5:7003]: Connection refused: /172.18.0.5:7003
2021-04-05 03:33:11.289  WARN 1 --- [ioEventLoop-4-3] i.l.c.c.t.DefaultClusterTopologyRefresh  : Unable to connect to [172.18.0.5:7003]: Connection refused: /172.18.0.5:7003
2021-04-05 03:33:11.304  WARN 1 --- [ioEventLoop-4-2] i.l.c.c.t.DefaultClusterTopologyRefresh  : Unable to connect to [172.18.0.5:7003]: Connection refused: /172.18.0.5:7003
2021-04-05 03:33:11.321  WARN 1 --- [ioEventLoop-4-1] i.l.c.c.t.DefaultClusterTopologyRefresh  : Unable to connect to [172.18.0.5:7003]: Connection refused: /172.18.0.5:7003
2021-04-05 03:33:11.349  WARN 1 --- [ioEventLoop-4-4] i.l.c.c.t.DefaultClusterTopologyRefresh  : Unable to connect to [172.18.0.5:7003]: Connection refused: /172.18.0.5:7003
2021-04-05 03:33:11.365  WARN 1 --- [ioEventLoop-4-3] i.l.c.c.t.DefaultClusterTopologyRefresh  : Unable to connect to [172.18.0.5:7003]: Connection refused: /172.18.0.5:7003
2021-04-05 03:33:21.390  WARN 1 --- [ioEventLoop-4-2] i.l.c.c.t.DefaultClusterTopologyRefresh  : Unable to connect to [172.18.0.5:7003]: connection timed out: /172.18.0.5:7003
2021-04-05 03:33:22.631  INFO 1 --- [xecutorLoop-1-1] i.l.core.protocol.ConnectionWatchdog     : Reconnecting, last destination was 172.18.0.5:7003
2021-04-05 03:33:22.631  INFO 1 --- [xecutorLoop-1-4] i.l.core.protocol.ConnectionWatchdog     : Reconnecting, last destination was 172.18.0.5:7003
2021-04-05 03:33:31.418  WARN 1 --- [ioEventLoop-4-3] i.l.c.c.t.DefaultClusterTopologyRefresh  : Unable to connect to [172.18.0.5:7003]: connection timed out: /172.18.0.5:7003
2021-04-05 03:33:32.636  WARN 1 --- [ioEventLoop-4-1] i.l.core.protocol.ConnectionWatchdog     : Cannot reconnect to [172.18.0.5:7003]: connection timed out: /172.18.0.5:7003
2021-04-05 03:33:32.636  WARN 1 --- [ioEventLoop-4-2] i.l.core.protocol.ConnectionWatchdog     : Cannot reconnect to [172.18.0.5:7003]: connection timed out: /172.18.0.5:7003
2021-04-05 03:33:34.733  INFO 1 --- [xecutorLoop-1-2] i.l.core.protocol.ConnectionWatchdog     : Reconnecting, last destination was 172.18.0.5:7003
2021-04-05 03:33:34.733  INFO 1 --- [xecutorLoop-1-3] i.l.core.protocol.ConnectionWatchdog     : Reconnecting, last destination was 172.18.0.5:7003

get

(screenshot)

WARN messages keep being emitted during this time.

Log after restarting the Redis node that was taken down

2021-04-05 03:34:22.636  WARN 1 --- [ioEventLoop-4-2] i.l.core.protocol.ConnectionWatchdog     : Cannot reconnect to [172.18.0.5:7003]: No route to host: /172.18.0.5:7003

io.netty.channel.AbstractChannel$AnnotatedNoRouteToHostException: No route to host: /172.18.0.5:7003
Caused by: java.net.NoRouteToHostException: No route to host
        at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[na:na]
        at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:779) ~[na:na]
        at io.netty.channel.socket.nio.NioSocketChannel.doFinishConnect(NioSocketChannel.java:330) ~[netty-transport-4.1.60.Final.jar!/:4.1.60.Final]
        at io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.finishConnect(AbstractNioChannel.java:334) ~[netty-transport-4.1.60.Final.jar!/:4.1.60.Final]
        at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:707) ~[netty-transport-4.1.60.Final.jar!/:4.1.60.Final]
        at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:655) ~[netty-transport-4.1.60.Final.jar!/:4.1.60.Final]
        at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:581) ~[netty-transport-4.1.60.Final.jar!/:4.1.60.Final]
        at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:493) ~[netty-transport-4.1.60.Final.jar!/:4.1.60.Final]
        at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:989) ~[netty-common-4.1.60.Final.jar!/:4.1.60.Final]
        at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74) ~[netty-common-4.1.60.Final.jar!/:4.1.60.Final]
        at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30) ~[netty-common-4.1.60.Final.jar!/:4.1.60.Final]
        at java.base/java.lang.Thread.run(Thread.java:834) ~[na:na]

2021-04-05 03:34:22.636  WARN 1 --- [ioEventLoop-4-3] i.l.core.protocol.ConnectionWatchdog     : Cannot reconnect to [172.18.0.5:7003]: No route to host: /172.18.0.5:7003

io.netty.channel.AbstractChannel$AnnotatedNoRouteToHostException: No route to host: /172.18.0.5:7003
Caused by: java.net.NoRouteToHostException: No route to host
        at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[na:na]
        at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:779) ~[na:na]
        at io.netty.channel.socket.nio.NioSocketChannel.doFinishConnect(NioSocketChannel.java:330) ~[netty-transport-4.1.60.Final.jar!/:4.1.60.Final]
        at io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.finishConnect(AbstractNioChannel.java:334) ~[netty-transport-4.1.60.Final.jar!/:4.1.60.Final]
        at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:707) ~[netty-transport-4.1.60.Final.jar!/:4.1.60.Final]
        at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:655) ~[netty-transport-4.1.60.Final.jar!/:4.1.60.Final]
        at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:581) ~[netty-transport-4.1.60.Final.jar!/:4.1.60.Final]
        at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:493) ~[netty-transport-4.1.60.Final.jar!/:4.1.60.Final]
        at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:989) ~[netty-common-4.1.60.Final.jar!/:4.1.60.Final]
        at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74) ~[netty-common-4.1.60.Final.jar!/:4.1.60.Final]
        at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30) ~[netty-common-4.1.60.Final.jar!/:4.1.60.Final]
        at java.base/java.lang.Thread.run(Thread.java:834) ~[na:na]

2021-04-05 03:34:25.766  WARN 1 --- [ioEventLoop-4-1] i.l.c.c.t.DefaultClusterTopologyRefresh  : Unable to connect to [172.18.0.5:7003]: No route to host: /172.18.0.5:7003
2021-04-05 03:34:28.876  WARN 1 --- [ioEventLoop-4-4] i.l.c.c.t.DefaultClusterTopologyRefresh  : Unable to connect to [172.18.0.5:7003]: No route to host: /172.18.0.5:7003
2021-04-05 03:34:32.006  WARN 1 --- [ioEventLoop-4-3] i.l.c.c.t.DefaultClusterTopologyRefresh  : Unable to connect to [172.18.0.5:7003]: No route to host: /172.18.0.5:7003
2021-04-05 03:34:35.116  WARN 1 --- [ioEventLoop-4-2] i.l.c.c.t.DefaultClusterTopologyRefresh  : Unable to connect to [172.18.0.5:7003]: No route to host: /172.18.0.5:7003
2021-04-05 03:34:38.236  WARN 1 --- [ioEventLoop-4-1] i.l.c.c.t.DefaultClusterTopologyRefresh  : Unable to connect to [172.18.0.5:7003]: No route to host: /172.18.0.5:7003
2021-04-05 03:34:41.356  WARN 1 --- [ioEventLoop-4-4] i.l.c.c.t.DefaultClusterTopologyRefresh  : Unable to connect to [172.18.0.5:7003]: No route to host: /172.18.0.5:7003
2021-04-05 03:34:44.476  WARN 1 --- [ioEventLoop-4-3] i.l.c.c.t.DefaultClusterTopologyRefresh  : Unable to connect to [172.18.0.5:7003]: No route to host: /172.18.0.5:7003
2021-04-05 03:34:47.606  WARN 1 --- [ioEventLoop-4-2] i.l.c.c.t.DefaultClusterTopologyRefresh  : Unable to connect to [172.18.0.5:7003]: No route to host: /172.18.0.5:7003
2021-04-05 03:34:52.731  INFO 1 --- [xecutorLoop-1-4] i.l.core.protocol.ConnectionWatchdog     : Reconnecting, last destination was 172.18.0.5:7003
2021-04-05 03:34:52.731  INFO 1 --- [xecutorLoop-1-1] i.l.core.protocol.ConnectionWatchdog     : Reconnecting, last destination was 172.18.0.5:7003
2021-04-05 03:34:52.737  INFO 1 --- [ioEventLoop-4-1] i.l.core.protocol.ReconnectionHandler    : Reconnected to 172.18.0.5:7003
2021-04-05 03:34:52.749  INFO 1 --- [ioEventLoop-4-2] i.l.core.protocol.ReconnectionHandler    : Reconnected to 172.18.0.5:7003

get

(screenshot)
